[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
12033 1726867160.67330: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12033 1726867160.67758: Added group all to inventory
12033 1726867160.67761: Added group ungrouped to inventory
12033 1726867160.67765: Group all now contains ungrouped
12033 1726867160.67768: Examining possible inventory source: /tmp/network-5rw/inventory.yml
12033 1726867160.78750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12033 1726867160.78810: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12033 1726867160.78831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12033 1726867160.78889: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12033 1726867160.78965: Loaded config def from plugin (inventory/script)
12033 1726867160.78968: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12033 1726867160.79008: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12033 1726867160.79096: Loaded config def from plugin (inventory/yaml)
12033 1726867160.79099: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12033 1726867160.79183: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12033 1726867160.79574: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12033 1726867160.79580: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12033 1726867160.79583: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12033 1726867160.79588: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12033 1726867160.79593: Loading data from /tmp/network-5rw/inventory.yml
12033 1726867160.79659: /tmp/network-5rw/inventory.yml was not parsable by auto
12033 1726867160.79723: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12033 1726867160.79762: Loading data from /tmp/network-5rw/inventory.yml
12033 1726867160.79843: group all already in inventory
12033 1726867160.79850: set inventory_file for managed_node1
12033 1726867160.79854: set inventory_dir for managed_node1
12033 1726867160.79855: Added host managed_node1 to inventory
12033 1726867160.79857: Added host managed_node1 to group all
12033 1726867160.79858: set ansible_host for managed_node1
12033 1726867160.79859: set ansible_ssh_extra_args for managed_node1
12033 1726867160.79862: set inventory_file for managed_node2
12033 1726867160.79865: set inventory_dir for managed_node2
12033 1726867160.79866: Added host managed_node2 to inventory
12033 1726867160.79867: Added host managed_node2 to group all
12033 1726867160.79868: set ansible_host for managed_node2
12033 1726867160.79869: set ansible_ssh_extra_args for managed_node2
12033 1726867160.79871: set inventory_file for managed_node3
12033 1726867160.79874: set inventory_dir for managed_node3
12033 1726867160.79875: Added host managed_node3 to inventory
12033 1726867160.79876: Added host managed_node3 to group all
12033 1726867160.79876: set ansible_host for managed_node3
12033 1726867160.79879: set ansible_ssh_extra_args for managed_node3
12033 1726867160.79881: Reconcile groups and hosts in inventory.
12033 1726867160.79884: Group ungrouped now contains managed_node1
12033 1726867160.79886: Group ungrouped now contains managed_node2
12033 1726867160.79887: Group ungrouped now contains managed_node3
12033 1726867160.79958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
12033 1726867160.80075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
12033 1726867160.80124: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
12033 1726867160.80151: Loaded config def from plugin (vars/host_group_vars)
12033 1726867160.80153: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
12033 1726867160.80160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
12033 1726867160.80168: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
12033 1726867160.80212: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
12033 1726867160.80537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867160.80605: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
12033 1726867160.80637: Loaded config def from plugin (connection/local)
12033 1726867160.80639: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
12033 1726867160.81013: Loaded config def from plugin (connection/paramiko_ssh)
12033 1726867160.81016: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
12033 1726867160.81563: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12033 1726867160.81589: Loaded config def from plugin (connection/psrp)
12033 1726867160.81591: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
12033 1726867160.82000: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12033 1726867160.82023: Loaded config def from plugin (connection/ssh)
12033 1726867160.82025: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
12033 1726867160.83655: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12033 1726867160.83681: Loaded config def from plugin (connection/winrm)
12033 1726867160.83684: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
12033 1726867160.83705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
12033 1726867160.83747: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
12033 1726867160.83786: Loaded config def from plugin (shell/cmd)
12033 1726867160.83788: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
12033 1726867160.83807: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
12033 1726867160.83843: Loaded config def from plugin (shell/powershell)
12033 1726867160.83844: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
12033 1726867160.83880: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
12033 1726867160.83984: Loaded config def from plugin (shell/sh)
12033 1726867160.83986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
12033 1726867160.84009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
12033 1726867160.84082: Loaded config def from plugin (become/runas)
12033 1726867160.84083: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
12033 1726867160.84193: Loaded config def from plugin (become/su)
12033 1726867160.84195: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
12033 1726867160.84289: Loaded config def from plugin (become/sudo)
12033 1726867160.84291: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
12033 1726867160.84312: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
12033 1726867160.84524: in VariableManager get_vars()
12033 1726867160.84538: done with get_vars()
12033 1726867160.84625: trying /usr/local/lib/python3.12/site-packages/ansible/modules
12033 1726867160.86510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
12033 1726867160.86579: in VariableManager get_vars()
12033 1726867160.86583: done with get_vars()
12033 1726867160.86585: variable 'playbook_dir' from source: magic vars
12033 1726867160.86585: variable 'ansible_playbook_python' from source: magic vars
12033 1726867160.86586: variable 'ansible_config_file' from source: magic vars
12033 1726867160.86586: variable 'groups' from source: magic vars
12033 1726867160.86587: variable 'omit' from source: magic vars
12033 1726867160.86587: variable 'ansible_version' from source: magic vars
12033 1726867160.86588: variable 'ansible_check_mode' from source: magic vars
12033 1726867160.86588: variable 'ansible_diff_mode' from source: magic vars
12033 1726867160.86589: variable 'ansible_forks' from source: magic vars
12033 1726867160.86590: variable 'ansible_inventory_sources' from source: magic vars
12033 1726867160.86590: variable 'ansible_skip_tags' from source: magic vars
12033 1726867160.86590: variable 'ansible_limit' from source: magic vars
12033 1726867160.86591: variable 'ansible_run_tags' from source: magic vars
12033 1726867160.86591: variable 'ansible_verbosity' from source: magic vars
12033 1726867160.86613: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml
12033 1726867160.86963: in VariableManager get_vars()
12033 1726867160.86974: done with get_vars()
12033 1726867160.87055: in VariableManager get_vars()
12033 1726867160.87065: done with get_vars()
12033 1726867160.87101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
12033 1726867160.87110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
12033 1726867160.87259: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
12033 1726867160.87352: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
12033 1726867160.87354: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
12033 1726867160.87374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
12033 1726867160.87393: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
12033 1726867160.87494: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
12033 1726867160.87532: Loaded config def from plugin (callback/default)
12033 1726867160.87533: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12033 1726867160.88250: Loaded config def from plugin (callback/junit)
12033 1726867160.88252: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12033 1726867160.88285: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
12033 1726867160.88323: Loaded config def from plugin (callback/minimal)
12033 1726867160.88324: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12033 1726867160.88350: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12033 1726867160.88391: Loaded config def from plugin (callback/tree)
12033 1726867160.88393: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
12033 1726867160.88467: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
12033 1726867160.88469: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_options_nm.yml ********************************************
2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml
12033 1726867160.88493: in VariableManager get_vars()
12033 1726867160.88501: done with get_vars()
12033 1726867160.88504: in VariableManager get_vars()
12033 1726867160.88509: done with get_vars()
12033 1726867160.88512: variable 'omit' from source: magic vars
12033 1726867160.88534: in VariableManager get_vars()
12033 1726867160.88542: done with get_vars()
12033 1726867160.88554: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] *****
12033 1726867160.88955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12033 1726867160.90059: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12033 1726867160.90085: getting the remaining hosts for this loop
12033 1726867160.90086: done getting the remaining hosts for this loop
12033 1726867160.90089: getting the next task for host managed_node3
12033 1726867160.90091: done getting next task for host managed_node3
12033 1726867160.90092: ^ task is: TASK: Gathering Facts
12033 1726867160.90093: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867160.90095: getting variables
12033 1726867160.90095: in VariableManager get_vars()
12033 1726867160.90102: Calling all_inventory to load vars for managed_node3
12033 1726867160.90104: Calling groups_inventory to load vars for managed_node3
12033 1726867160.90105: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867160.90113: Calling all_plugins_play to load vars for managed_node3
12033 1726867160.90122: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867160.90124: Calling groups_plugins_play to load vars for managed_node3
12033 1726867160.90145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867160.90176: done with get_vars()
12033 1726867160.90183: done getting variables
12033 1726867160.90226: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
Friday 20 September 2024 17:19:20 -0400 (0:00:00.018) 0:00:00.018 ******
12033 1726867160.90242: entering _queue_task() for managed_node3/gather_facts
12033 1726867160.90243: Creating lock for gather_facts
12033 1726867160.90506: worker is 1 (out of 1 available)
12033 1726867160.90519: exiting _queue_task() for managed_node3/gather_facts
12033 1726867160.90530: done queuing things up, now waiting for results queue to drain
12033 1726867160.90533: waiting for pending results...
12033 1726867160.90659: running TaskExecutor() for managed_node3/TASK: Gathering Facts
12033 1726867160.90715: in run() - task 0affcac9-a3a5-74bb-502b-000000000015
12033 1726867160.90726: variable 'ansible_search_path' from source: unknown
12033 1726867160.90753: calling self._execute()
12033 1726867160.90800: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867160.90805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867160.90812: variable 'omit' from source: magic vars
12033 1726867160.90878: variable 'omit' from source: magic vars
12033 1726867160.90902: variable 'omit' from source: magic vars
12033 1726867160.90929: variable 'omit' from source: magic vars
12033 1726867160.90960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12033 1726867160.90989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12033 1726867160.91008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12033 1726867160.91020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867160.91029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867160.91051: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12033 1726867160.91054: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867160.91057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867160.91128: Set connection var ansible_pipelining to False
12033 1726867160.91135: Set connection var ansible_shell_executable to /bin/sh
12033 1726867160.91141: Set connection var ansible_timeout to 10
12033 1726867160.91146: Set connection var ansible_module_compression to ZIP_DEFLATED
12033 1726867160.91148: Set connection var ansible_connection to ssh
12033 1726867160.91153: Set connection var ansible_shell_type to sh
12033 1726867160.91169: variable 'ansible_shell_executable' from source: unknown
12033 1726867160.91172: variable 'ansible_connection' from source: unknown
12033 1726867160.91174: variable 'ansible_module_compression' from source: unknown
12033 1726867160.91178: variable 'ansible_shell_type' from source: unknown
12033 1726867160.91181: variable 'ansible_shell_executable' from source: unknown
12033 1726867160.91183: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867160.91188: variable 'ansible_pipelining' from source: unknown
12033 1726867160.91193: variable 'ansible_timeout' from source: unknown
12033 1726867160.91196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867160.91348: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
12033 1726867160.91356: variable 'omit' from source: magic vars
12033 1726867160.91359: starting attempt loop
12033 1726867160.91361: running the handler
12033 1726867160.91374: variable 'ansible_facts' from source: unknown
12033 1726867160.91390: _low_level_execute_command(): starting
12033 1726867160.91399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12033 1726867160.91909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867160.91914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867160.91917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867160.91964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867160.91968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867160.91972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867160.92034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867160.93752: stdout chunk (state=3): >>>/root <<<
12033 1726867160.93851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867160.93881: stderr chunk (state=3): >>><<<
12033 1726867160.93884: stdout chunk (state=3): >>><<<
12033 1726867160.93910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867160.93918: _low_level_execute_command(): starting
12033 1726867160.93923: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670 `" && echo ansible-tmp-1726867160.939047-12056-28554928610670="` echo /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670 `" ) && sleep 0'
12033 1726867160.94341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867160.94376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
12033 1726867160.94381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867160.94383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<<
12033 1726867160.94385: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867160.94394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867160.94437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867160.94443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867160.94445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867160.94495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867160.96389: stdout chunk (state=3): >>>ansible-tmp-1726867160.939047-12056-28554928610670=/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670 <<<
12033 1726867160.96498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867160.96523: stderr chunk (state=3): >>><<<
12033 1726867160.96526: stdout chunk (state=3): >>><<<
12033 1726867160.96543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867160.939047-12056-28554928610670=/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867160.96568: variable 'ansible_module_compression' from source: unknown
12033 1726867160.96613: ANSIBALLZ: Using generic lock for ansible.legacy.setup
12033 1726867160.96616: ANSIBALLZ: Acquiring lock
12033 1726867160.96619: ANSIBALLZ: Lock acquired: 139897899327968
12033 1726867160.96621: ANSIBALLZ: Creating module
12033 1726867161.18235: ANSIBALLZ: Writing module into payload
12033 1726867161.18331: ANSIBALLZ: Writing module
12033 1726867161.18347: ANSIBALLZ: Renaming module
12033 1726867161.18353: ANSIBALLZ: Done creating module
12033 1726867161.18382: variable 'ansible_facts' from source: unknown
12033 1726867161.18390: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12033 1726867161.18397: _low_level_execute_command(): starting
12033 1726867161.18403: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
12033 1726867161.18860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867161.18918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867161.18934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867161.18960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867161.18963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867161.19064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867161.20754: stdout chunk (state=3): >>>PLATFORM <<<
12033 1726867161.20827: stdout chunk (state=3): >>>Linux <<<
12033 1726867161.20843: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<<
12033 1726867161.20854: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
12033 1726867161.20989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867161.21026: stderr chunk (state=3): >>><<<
12033 1726867161.21028: stdout chunk (state=3): >>><<<
12033 1726867161.21084: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867161.21092 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
12033 1726867161.21096: _low_level_execute_command(): starting
12033 1726867161.21098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
12033 1726867161.21172: Sending initial data
12033 1726867161.21175: Sent initial data (1181 bytes)
12033 1726867161.21528: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867161.21531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
12033 1726867161.21533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867161.21538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867161.21544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867161.21592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867161.21597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867161.21640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867161.25036: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<<
12033 1726867161.25405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867161.25427: stderr chunk (state=3): >>><<<
12033 1726867161.25431: stdout chunk (state=3): >>><<<
12033 1726867161.25442: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel
fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867161.25504: variable 'ansible_facts' from source: unknown 12033 1726867161.25508: variable 'ansible_facts' from source: unknown 12033 1726867161.25515: variable 'ansible_module_compression' from source: unknown 12033 1726867161.25546: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12033 
1726867161.25570: variable 'ansible_facts' from source: unknown 12033 1726867161.25693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py 12033 1726867161.25791: Sending initial data 12033 1726867161.25795: Sent initial data (152 bytes) 12033 1726867161.26219: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867161.26223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.26225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867161.26227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867161.26229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.26276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867161.26281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867161.26330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867161.27869: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867161.27879: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867161.27911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867161.27956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpktqoprl5 /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py <<< 12033 1726867161.27966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py" <<< 12033 1726867161.28000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpktqoprl5" to remote "/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py" <<< 12033 1726867161.29027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867161.29062: stderr chunk (state=3): >>><<< 12033 1726867161.29066: stdout chunk (state=3): >>><<< 12033 1726867161.29083: done transferring module to remote 12033 1726867161.29095: _low_level_execute_command(): starting 12033 
1726867161.29098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/ /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py && sleep 0' 12033 1726867161.29498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867161.29506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.29524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.29568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867161.29571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867161.29621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867161.31392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867161.31412: stderr chunk (state=3): >>><<< 12033 1726867161.31415: stdout chunk (state=3): >>><<< 12033 1726867161.31428: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867161.31430: _low_level_execute_command(): starting 12033 1726867161.31438: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/AnsiballZ_setup.py && sleep 0' 12033 1726867161.31846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867161.31849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.31851: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867161.31854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867161.31856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867161.31907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867161.31911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867161.31963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867161.34093: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12033 1726867161.34126: stdout chunk (state=3): >>>import _imp # builtin <<< 12033 1726867161.34157: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12033 1726867161.34226: stdout chunk (state=3): >>>import '_io' # <<< 12033 1726867161.34231: stdout chunk (state=3): >>>import 'marshal' # <<< 12033 1726867161.34270: stdout chunk (state=3): >>>import 'posix' # <<< 12033 1726867161.34302: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12033 1726867161.34311: stdout chunk (state=3): >>># installing zipimport hook <<< 12033 1726867161.34318: stdout chunk (state=3): >>>import 'time' # <<< 12033 1726867161.34333: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12033 1726867161.34382: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py <<< 12033 1726867161.34392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.34402: stdout chunk (state=3): >>>import '_codecs' # <<< 12033 1726867161.34426: stdout chunk (state=3): >>>import 'codecs' # <<< 12033 1726867161.34467: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12033 1726867161.34494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71327104d0> <<< 12033 1726867161.34497: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71326dfb30> <<< 12033 1726867161.34525: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 12033 1726867161.34533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12033 1726867161.34538: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132712a50> <<< 12033 1726867161.34565: stdout chunk (state=3): >>>import '_signal' # <<< 12033 1726867161.34593: stdout chunk (state=3): >>>import '_abc' # <<< 12033 1726867161.34597: stdout chunk (state=3): >>>import 'abc' # <<< 12033 1726867161.34603: stdout chunk (state=3): >>>import 'io' # <<< 12033 1726867161.34640: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 12033 1726867161.34726: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12033 1726867161.34753: stdout chunk (state=3): >>>import 'genericpath' # <<< 12033 1726867161.34758: stdout chunk (state=3): >>>import 'posixpath' # <<< 12033 
1726867161.34780: stdout chunk (state=3): >>>import 'os' # <<< 12033 1726867161.34810: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 12033 1726867161.34814: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 12033 1726867161.34838: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 12033 1726867161.34843: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12033 1726867161.34870: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12033 1726867161.34906: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71324e5130> <<< 12033 1726867161.34959: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12033 1726867161.34966: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.34969: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71324e5fa0> <<< 12033 1726867161.34998: stdout chunk (state=3): >>>import 'site' # <<< 12033 1726867161.35028: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 12033 1726867161.35400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12033 1726867161.35404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12033 1726867161.35438: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12033 1726867161.35444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.35468: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12033 1726867161.35500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12033 1726867161.35521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12033 1726867161.35546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12033 1726867161.35558: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132523e00> <<< 12033 1726867161.35573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12033 1726867161.35597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12033 1726867161.35611: stdout chunk (state=3): >>>import '_operator' # <<< 12033 1726867161.35630: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132523ec0> <<< 12033 1726867161.35634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12033 1726867161.35668: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12033 1726867161.35687: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12033 1726867161.35738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.35751: stdout chunk (state=3): >>>import 'itertools' # <<< 12033 1726867161.35784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 12033 1726867161.35796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713255b7d0> <<< 12033 1726867161.35810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 12033 1726867161.35816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713255be60> <<< 12033 1726867161.35836: stdout chunk (state=3): >>>import '_collections' # <<< 12033 1726867161.35884: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253bad0> <<< 12033 1726867161.35899: stdout chunk (state=3): >>>import '_functools' # <<< 12033 1726867161.35922: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325391f0> <<< 12033 1726867161.36012: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132520fb0> <<< 12033 1726867161.36035: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 12033 1726867161.36059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12033 1726867161.36064: stdout chunk (state=3): >>>import '_sre' # <<< 12033 1726867161.36099: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12033 1726867161.36111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12033 1726867161.36137: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12033 1726867161.36171: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713257b770> <<< 12033 1726867161.36186: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713257a390> <<< 12033 1726867161.36220: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253a090> <<< 12033 1726867161.36227: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132578bc0> <<< 12033 1726867161.36289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12033 1726867161.36298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b0800> <<< 12033 1726867161.36301: 
stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132520230> <<< 12033 1726867161.36319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12033 1726867161.36349: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.36355: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325b0cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b0b60> <<< 12033 1726867161.36397: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325b0ef0> <<< 12033 1726867161.36400: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713251ed50> <<< 12033 1726867161.36438: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 12033 1726867161.36442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.36458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12033 1726867161.36491: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 12033 1726867161.36510: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b1580> <<< 12033 1726867161.36517: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b1250> import 'importlib.machinery' # <<< 12033 1726867161.36545: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12033 1726867161.36569: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b2480> <<< 12033 1726867161.36579: stdout chunk (state=3): >>>import 'importlib.util' # <<< 12033 1726867161.36593: stdout chunk (state=3): >>>import 'runpy' # <<< 12033 1726867161.36607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12033 1726867161.36647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 12033 1726867161.36671: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 12033 1726867161.36686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325c86b0> <<< 12033 1726867161.36691: stdout chunk (state=3): >>>import 'errno' # <<< 12033 1726867161.36711: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.36733: stdout chunk (state=3): >>># extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325c9d90> <<< 12033 1726867161.36755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 12033 1726867161.36758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 12033 1726867161.36796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 12033 1726867161.36803: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325cac30> <<< 12033 1726867161.36854: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.36858: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325cb290> <<< 12033 1726867161.36860: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325ca180> <<< 12033 1726867161.36883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 12033 1726867161.36893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12033 1726867161.36925: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.36942: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325cbd10> <<< 12033 1726867161.36947: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325cb440> <<< 12033 1726867161.36991: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b24e0> <<< 12033 1726867161.37012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12033 1726867161.37032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12033 1726867161.37055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12033 1726867161.37072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12033 1726867161.37105: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322c7bc0> <<< 12033 1726867161.37128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12033 1726867161.37164: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f71322f06e0> <<< 12033 1726867161.37171: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0440> <<< 12033 1726867161.37197: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.37203: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f0620> <<< 12033 1726867161.37230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 12033 1726867161.37235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12033 1726867161.37306: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.37432: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.37442: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f0fe0> <<< 12033 1726867161.37558: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.37561: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f1970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0890> <<< 12033 1726867161.37579: stdout chunk (state=3): >>>import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f71322c5d60> <<< 12033 1726867161.37602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12033 1726867161.37619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12033 1726867161.37645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 12033 1726867161.37657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 12033 1726867161.37673: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f2cf0> <<< 12033 1726867161.37683: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0e60> <<< 12033 1726867161.37705: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b2bd0> <<< 12033 1726867161.37729: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12033 1726867161.37795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.37803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12033 1726867161.37845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12033 1726867161.37866: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713231f020> <<< 12033 1726867161.37920: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12033 1726867161.37932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.37959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12033 1726867161.37971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12033 1726867161.38012: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323433e0> <<< 12033 1726867161.38034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12033 1726867161.38081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12033 1726867161.38130: stdout chunk (state=3): >>>import 'ntpath' # <<< 12033 1726867161.38155: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 12033 1726867161.38159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a01a0> <<< 12033 1726867161.38179: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12033 1726867161.38206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12033 1726867161.38231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12033 1726867161.38269: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12033 1726867161.38353: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a28d0> <<< 12033 1726867161.38426: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a02c0> <<< 12033 1726867161.38464: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713236d190> <<< 12033 1726867161.38502: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 12033 1726867161.38506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321ad1f0> <<< 12033 1726867161.38508: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323421e0> <<< 12033 1726867161.38528: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f3bf0> <<< 12033 1726867161.38690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12033 1726867161.38713: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f71323428a0> <<< 12033 1726867161.38959: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__uy3atw5/ansible_ansible.legacy.setup_payload.zip' <<< 12033 1726867161.38965: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39093: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches 
/usr/lib64/python3.12/pkgutil.py <<< 12033 1726867161.39134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12033 1726867161.39179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12033 1726867161.39305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713220ef00> import '_typing' # <<< 12033 1726867161.39556: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321eddf0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321ecf50> <<< 12033 1726867161.39560: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39563: stdout chunk (state=3): >>>import 'ansible' # <<< 12033 1726867161.39565: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39567: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39601: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.39604: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 12033 1726867161.40997: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.42142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713220cc50> <<< 12033 1726867161.42246: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12033 1726867161.42461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322467e0> <<< 12033 1726867161.42465: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132246570> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132245e80> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322465d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322474d0> # extension module 'fcntl' loaded from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7132247710> <<< 12033 1726867161.42485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12033 1726867161.42523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12033 1726867161.42539: stdout chunk (state=3): >>>import '_locale' # <<< 12033 1726867161.42584: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132247c50> <<< 12033 1726867161.42607: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12033 1726867161.42633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12033 1726867161.42675: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b29a60> <<< 12033 1726867161.42713: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b2b680> <<< 12033 1726867161.42736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12033 1726867161.42748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12033 1726867161.42793: stdout chunk (state=3): >>>import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2bf50> <<< 12033 1726867161.42815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12033 1726867161.42840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2d1f0> <<< 12033 1726867161.42862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12033 1726867161.42918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12033 1726867161.42930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12033 1726867161.42975: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2fc80> <<< 12033 1726867161.43039: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.43044: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b2ffb0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2df40> <<< 12033 1726867161.43096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12033 1726867161.43100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 
12033 1726867161.43136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12033 1726867161.43140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12033 1726867161.43383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12033 1726867161.43418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b37bc0> import '_tokenize' # <<< 12033 1726867161.43421: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b36690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b363f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12033 1726867161.43442: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b36960> <<< 12033 1726867161.43471: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2e450> <<< 12033 1726867161.43592: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7be90> <<< 12033 1726867161.43629: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.43653: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12033 1726867161.43685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12033 1726867161.43739: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7ff80> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7e0f0> <<< 12033 1726867161.43759: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12033 1726867161.44017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b836e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7ffb0> <<< 12033 1726867161.44072: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84980> <<< 12033 1726867161.44104: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84500> <<< 12033 1726867161.44145: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84a40> <<< 12033 1726867161.44170: stdout chunk (state=3): >>>import 
'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7c080> <<< 12033 1726867161.44244: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.44271: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a100e0> <<< 12033 1726867161.44450: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a111c0> <<< 12033 1726867161.44482: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b86840> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b87bf0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b86480> <<< 12033 1726867161.44521: stdout chunk (state=3): >>># 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12033 1726867161.44609: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.44699: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.44740: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 12033 1726867161.44765: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12033 1726867161.44887: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.45007: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.45530: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.46066: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 12033 1726867161.46095: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12033 1726867161.46145: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.46251: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a154c0> <<< 12033 1726867161.46275: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a16240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a113d0> <<< 12033 1726867161.46325: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12033 1726867161.46344: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.46387: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 12033 1726867161.46538: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.46693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12033 1726867161.46715: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a169c0> # zipimport: zlib available <<< 12033 1726867161.47167: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.47709: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.47730: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.47759: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12033 1726867161.47816: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.47922: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 12033 1726867161.47949: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.48007: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12033 1726867161.48046: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 
12033 1726867161.48097: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.48141: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12033 1726867161.48175: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.48379: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.48610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12033 1726867161.48664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12033 1726867161.48686: stdout chunk (state=3): >>>import '_ast' # <<< 12033 1726867161.48740: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a17350> <<< 12033 1726867161.48759: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.48825: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.48901: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12033 1726867161.48935: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 12033 1726867161.48985: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49019: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12033 1726867161.49046: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49075: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49118: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49176: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49278: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12033 1726867161.49307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.49371: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a21c70> <<< 12033 1726867161.49409: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a1fe30> <<< 12033 1726867161.49460: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12033 1726867161.49469: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.49616: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.49661: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.49695: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12033 1726867161.49722: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12033 1726867161.49725: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12033 1726867161.49773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12033 1726867161.49800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12033 1726867161.49823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12033 1726867161.49859: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b0a840> <<< 12033 1726867161.50016: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131bfe510> <<< 12033 1726867161.50047: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a21eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a21d00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.50068: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12033 1726867161.50111: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12033 1726867161.50156: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 12033 1726867161.50169: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50293: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.50318: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.50364: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50408: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12033 1726867161.50443: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 12033 1726867161.50563: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50636: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50679: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.50795: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12033 1726867161.50905: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.51052: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.51132: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.51195: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.51249: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12033 1726867161.51492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12033 1726867161.51496: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab5d60> <<< 12033 1726867161.51509: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316f7e90> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316fc230> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a9e870> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab68d0> <<< 12033 1726867161.51546: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab4470> <<< 12033 1726867161.51556: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab40b0> <<< 12033 1726867161.51574: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12033 1726867161.51646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12033 1726867161.51653: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 12033 1726867161.51679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 12033 1726867161.51718: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316ff140> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316fe9f0> <<< 12033 1726867161.51752: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.51768: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316febd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316fde20> <<< 12033 1726867161.51795: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12033 1726867161.51943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316ff290> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12033 1726867161.51991: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 12033 1726867161.52025: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131765d90> <<< 12033 1726867161.52032: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316ffd70> <<< 12033 1726867161.52056: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab4140> import 'ansible.module_utils.facts.timeout' # <<< 12033 1726867161.52068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 12033 1726867161.52095: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 12033 1726867161.52109: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52171: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 12033 1726867161.52240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52293: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52339: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12033 1726867161.52357: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52374: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 12033 1726867161.52389: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12033 1726867161.52419: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 12033 1726867161.52464: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52510: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12033 1726867161.52571: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52619: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12033 1726867161.52679: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52732: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52794: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52849: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.52906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12033 1726867161.52928: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.53412: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.53961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 12033 1726867161.54027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.54074: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 12033 1726867161.54079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 12033 1726867161.54135: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.module_utils.facts.system.env' # <<< 12033 1726867161.54138: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54188: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12033 1726867161.54248: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54282: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 12033 1726867161.54329: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54351: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 12033 1726867161.54400: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54476: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.54607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131767680> <<< 12033 1726867161.54623: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12033 1726867161.54656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12033 1726867161.54758: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131766930> <<< 12033 1726867161.54767: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 12033 1726867161.54834: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 12033 1726867161.54903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12033 1726867161.54916: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55000: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12033 1726867161.55102: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55162: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 12033 1726867161.55250: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55291: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12033 1726867161.55385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12033 1726867161.55451: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867161.55507: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f713179a150> <<< 12033 1726867161.55701: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713178ad80> import 'ansible.module_utils.facts.system.python' # <<< 12033 1726867161.55710: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55774: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55821: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.selinux' # <<< 12033 1726867161.55835: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.55916: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56002: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56127: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56265: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 12033 1726867161.56288: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56331: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 12033 1726867161.56413: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12033 1726867161.56561: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71317adaf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71317af4a0> import 'ansible.module_utils.facts.system.user' # <<< 12033 1726867161.56567: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56616: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.56699: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 12033 1726867161.56847: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.56980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 12033 1726867161.57092: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57228: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.57276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 12033 1726867161.57351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 12033 1726867161.57364: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57487: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 12033 1726867161.57647: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 12033 1726867161.57760: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12033 1726867161.57909: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57930: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.57957: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.58509: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.59031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 12033 1726867161.59036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 12033 1726867161.59155: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12033 1726867161.59258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12033 1726867161.59272: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.59362: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.59507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 12033 1726867161.59654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 12033 1726867161.60020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 12033 1726867161.60022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60120: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60314: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 12033 1726867161.60567: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60573: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 12033 1726867161.60684: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60687: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12033 1726867161.60808: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 12033 1726867161.60834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 12033 1726867161.60863: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60885: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 12033 1726867161.60917: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.60956: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 12033 1726867161.61042: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61088: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12033 1726867161.61155: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61405: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12033 1726867161.61671: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61733: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 12033 1726867161.61805: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61840: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 12033 1726867161.61890: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61923: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.61955: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 12033 
1726867161.61971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62001: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 12033 1726867161.62044: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62141: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 12033 1726867161.62249: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 12033 1726867161.62269: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62313: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 12033 1726867161.62387: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62409: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62458: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62502: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62573: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 12033 1726867161.62683: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.62794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12033 1726867161.62968: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 12033 1726867161.63170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 12033 1726867161.63182: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63234: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12033 1726867161.63288: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63330: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12033 1726867161.63392: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63468: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12033 1726867161.63580: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63676: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.63758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12033 1726867161.63816: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867161.64335: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12033 1726867161.64364: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12033 1726867161.64374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' 
<<< 12033 1726867161.64408: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71315ade20> <<< 12033 1726867161.64432: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315a7b30> <<< 12033 1726867161.64474: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315a70b0> <<< 12033 1726867161.76044: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 12033 1726867161.76051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 12033 1726867161.76074: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315af530> <<< 12033 1726867161.76098: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 12033 1726867161.76113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 12033 1726867161.76138: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f4320> <<< 12033 1726867161.76203: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867161.76246: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 12033 1726867161.76250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f5940> <<< 12033 1726867161.76273: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f5400> <<< 12033 1726867161.76539: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 12033 1726867162.00793: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", 
"ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2984, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 547, "free": 2984}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansibl<<< 12033 1726867162.00857: stdout chunk (state=3): >>>e_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 398, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805559808, "block_size": 4096, "block_total": 65519099, "block_available": 63917373, "block_used": 1601726, "inode_total": 131070960, "inode_available": 131029138, "inode_used": 41822, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "21", "epoch": "1726867161", "epoch_int": "1726867161", "date": "2024-09-20", "time": "17:19:21", "iso8601_micro": "2024-0<<< 12033 1726867162.00865: stdout chunk (state=3): >>>9-20T21:19:21.963905Z", "iso8601": "2024-09-20T21:19:21Z", "iso8601_basic": "20240920T171921963905", "iso8601_basic_short": "20240920T171921", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.4560546875, "5m": 0.29443359375, "15m": 0.142578125}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off <<< 12033 1726867162.00880: stdout chunk (state=3): >>>[fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": 
[]}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12033 1726867162.01583: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12033 1726867162.01594: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 12033 1726867162.01615: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # 
restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] 
removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 <<< 12033 1726867162.01713: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] 
removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb<<< 12033 1726867162.01741: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai <<< 12033 1726867162.01782: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils <<< 12033 1726867162.01785: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd <<< 12033 1726867162.01791: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12033 1726867162.02178: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12033 1726867162.02229: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 12033 1726867162.02238: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12033 1726867162.02262: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 
12033 1726867162.02319: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 12033 1726867162.02335: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12033 1726867162.02385: stdout chunk (state=3): >>># destroy selinux <<< 12033 1726867162.02397: stdout chunk (state=3): >>># destroy shutil <<< 12033 1726867162.02424: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 12033 1726867162.02475: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 12033 1726867162.02481: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 12033 1726867162.02484: stdout chunk (state=3): >>># destroy _heapq # destroy _queue <<< 12033 1726867162.02509: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl <<< 12033 1726867162.02523: stdout chunk (state=3): >>># destroy datetime # destroy subprocess <<< 12033 1726867162.02552: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 12033 1726867162.02585: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 12033 1726867162.02603: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob <<< 12033 1726867162.02641: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # 
destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 12033 1726867162.02652: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 12033 1726867162.02674: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 12033 1726867162.02719: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12033 1726867162.02763: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 12033 1726867162.02775: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 12033 1726867162.02816: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 12033 1726867162.02834: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 12033 1726867162.02855: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12033 1726867162.02882: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12033 1726867162.03004: stdout chunk (state=3): >>># destroy sys.monitoring <<< 12033 1726867162.03007: stdout chunk (state=3): >>># destroy _socket <<< 12033 1726867162.03045: stdout chunk (state=3): >>># destroy _collections <<< 12033 1726867162.03048: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 12033 1726867162.03051: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 12033 1726867162.03074: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib 
<<< 12033 1726867162.03102: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing <<< 12033 1726867162.03122: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 12033 1726867162.03150: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12033 1726867162.03152: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12033 1726867162.03299: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 12033 1726867162.03344: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 12033 1726867162.03363: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 12033 1726867162.03774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867162.03781: stdout chunk (state=3): >>><<< 12033 1726867162.03784: stderr chunk (state=3): >>><<< 12033 1726867162.03980: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71327104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71326dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132712a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71324e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71324e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132523e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132523ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713255b7d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713255be60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253bad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325391f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132520fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713257b770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713257a390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253a090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132578bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b0800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132520230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325b0cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b0b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325b0ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713251ed50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b1580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b1250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b2480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325c86b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325c9d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325cac30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325cb290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325ca180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71325cbd10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325cb440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b24e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322c7bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f06e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f0620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f0fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322f1970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322c5d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f2cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f0e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71325b2bd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713231f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323433e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a28d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323a02c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713236d190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321ad1f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71323421e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322f3bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f71323428a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__uy3atw5/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f713220ef00> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321eddf0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71321ecf50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713220cc50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322467e0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132246570> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132245e80> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71322465d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713253a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71322474d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7132247710> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7132247c50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b29a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b2b680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2bf50> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2d1f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2fc80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b2ffb0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2df40> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b37bc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b36690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b363f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b36960> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b2e450> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7be90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b7ff80> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7e0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b836e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7ffb0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84980> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84500> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b84a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b7c080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a100e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a111c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b86840> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131b87bf0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b86480> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a154c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a16240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a113d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a169c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a17350> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131a21c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a1fe30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131b0a840> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131bfe510> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a21eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a21d00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab5d60> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316f7e90> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316fc230> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131a9e870> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab68d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab4470> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab40b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316ff140> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316fe9f0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71316febd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316fde20> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316ff290> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7131765d90> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71316ffd70> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131ab4140> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131767680> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7131766930> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f713179a150> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f713178ad80> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71317adaf0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71317af4a0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f71315ade20> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315a7b30> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315a70b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315af530> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f4320> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f5940> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f71315f5400> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": 
"64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2984, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 547, "free": 2984}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 398, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805559808, "block_size": 4096, "block_total": 65519099, "block_available": 63917373, "block_used": 1601726, "inode_total": 131070960, "inode_available": 131029138, "inode_used": 41822, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "21", "epoch": "1726867161", "epoch_int": "1726867161", "date": "2024-09-20", "time": "17:19:21", "iso8601_micro": "2024-09-20T21:19:21.963905Z", "iso8601": "2024-09-20T21:19:21Z", "iso8601_basic": "20240920T171921963905", "iso8601_basic_short": "20240920T171921", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": 
"", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_local": {}, "ansible_loadavg": {"1m": 0.4560546875, "5m": 0.29443359375, "15m": 0.142578125}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", 
"type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing 
marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing 
_compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap 
# cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection 
to 10.31.15.68 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache #
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
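[Editor's note] The interpreter-discovery warning above fires because Ansible resolved /usr/bin/python3.12 at runtime rather than being told which interpreter to use. It can be silenced by pinning the interpreter explicitly; a hypothetical ansible.cfg fragment (the path matches the one discovered in this log and may need adjusting for other hosts):

```
[defaults]
interpreter_python = /usr/bin/python3.12
```

The same pin can instead be set per host via the `ansible_python_interpreter` inventory variable.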
12033 1726867162.05689: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867162.05693: _low_level_execute_command(): starting 12033 1726867162.05696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867160.939047-12056-28554928610670/ > /dev/null 2>&1 && sleep 0' 12033 1726867162.05846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867162.05849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.05971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.05990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867162.06216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.06238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.08048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.08051: stderr chunk (state=3): >>><<< 12033 1726867162.08283: stdout chunk (state=3): >>><<< 12033 1726867162.08286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867162.08289: handler run complete 12033 1726867162.08291: variable 
'ansible_facts' from source: unknown 12033 1726867162.08321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.08710: variable 'ansible_facts' from source: unknown 12033 1726867162.08800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.08938: attempt loop complete, returning result 12033 1726867162.08948: _execute() done 12033 1726867162.08955: dumping result to json 12033 1726867162.08991: done dumping result, returning 12033 1726867162.09005: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcac9-a3a5-74bb-502b-000000000015] 12033 1726867162.09013: sending task result for task 0affcac9-a3a5-74bb-502b-000000000015 ok: [managed_node3] 12033 1726867162.09792: no more pending results, returning what we have 12033 1726867162.09795: results queue empty 12033 1726867162.09796: checking for any_errors_fatal 12033 1726867162.09797: done checking for any_errors_fatal 12033 1726867162.09798: checking for max_fail_percentage 12033 1726867162.09799: done checking for max_fail_percentage 12033 1726867162.09800: checking to see if all hosts have failed and the running result is not ok 12033 1726867162.09801: done checking to see if all hosts have failed 12033 1726867162.09802: getting the remaining hosts for this loop 12033 1726867162.09804: done getting the remaining hosts for this loop 12033 1726867162.09811: getting the next task for host managed_node3 12033 1726867162.09817: done getting next task for host managed_node3 12033 1726867162.09819: ^ task is: TASK: meta (flush_handlers) 12033 1726867162.09821: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867162.09825: getting variables 12033 1726867162.09827: in VariableManager get_vars() 12033 1726867162.09850: Calling all_inventory to load vars for managed_node3 12033 1726867162.09852: Calling groups_inventory to load vars for managed_node3 12033 1726867162.09856: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.09861: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000015 12033 1726867162.09865: WORKER PROCESS EXITING 12033 1726867162.09873: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.09878: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.09882: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.10068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.10269: done with get_vars() 12033 1726867162.10282: done getting variables 12033 1726867162.10338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 12033 1726867162.10401: in VariableManager get_vars() 12033 1726867162.10411: Calling all_inventory to load vars for managed_node3 12033 1726867162.10414: Calling groups_inventory to load vars for managed_node3 12033 1726867162.10416: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.10420: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.10423: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.10425: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.10560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.10739: done with get_vars() 12033 1726867162.10752: done queuing things up, now waiting for results queue to drain 12033 1726867162.10754: results queue empty 12033 1726867162.10755: checking for any_errors_fatal 12033 1726867162.10757: 
done checking for any_errors_fatal 12033 1726867162.10758: checking for max_fail_percentage 12033 1726867162.10759: done checking for max_fail_percentage 12033 1726867162.10763: checking to see if all hosts have failed and the running result is not ok 12033 1726867162.10764: done checking to see if all hosts have failed 12033 1726867162.10765: getting the remaining hosts for this loop 12033 1726867162.10766: done getting the remaining hosts for this loop 12033 1726867162.10768: getting the next task for host managed_node3 12033 1726867162.10772: done getting next task for host managed_node3 12033 1726867162.10774: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12033 1726867162.10776: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867162.10779: getting variables 12033 1726867162.10781: in VariableManager get_vars() 12033 1726867162.10791: Calling all_inventory to load vars for managed_node3 12033 1726867162.10793: Calling groups_inventory to load vars for managed_node3 12033 1726867162.10796: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.10800: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.10802: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.10804: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.11116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.11332: done with get_vars() 12033 1726867162.11340: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Friday 20 September 2024 17:19:22 -0400 (0:00:01.211) 0:00:01.229 ****** 12033 1726867162.11414: entering _queue_task() for managed_node3/include_tasks 12033 1726867162.11416: Creating lock for include_tasks 12033 1726867162.11781: worker is 1 (out of 1 available) 12033 1726867162.11794: exiting _queue_task() for managed_node3/include_tasks 12033 1726867162.11804: done queuing things up, now waiting for results queue to drain 12033 1726867162.11806: waiting for pending results... 12033 1726867162.11933: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 12033 1726867162.12018: in run() - task 0affcac9-a3a5-74bb-502b-000000000006 12033 1726867162.12029: variable 'ansible_search_path' from source: unknown 12033 1726867162.12082: calling self._execute() 12033 1726867162.12130: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867162.12137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867162.12283: variable 'omit' from source: magic vars 12033 1726867162.12287: _execute() done 12033 1726867162.12290: dumping result to json 12033 1726867162.12292: done dumping result, returning 12033 1726867162.12294: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-74bb-502b-000000000006] 12033 1726867162.12297: sending task result for task 0affcac9-a3a5-74bb-502b-000000000006 12033 1726867162.12361: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000006 12033 1726867162.12364: WORKER PROCESS EXITING 12033 1726867162.12416: no more pending results, returning what we have 12033 1726867162.12421: in VariableManager get_vars() 12033 1726867162.12454: Calling all_inventory to load vars for managed_node3 12033 1726867162.12457: Calling groups_inventory to load vars for managed_node3 12033 1726867162.12460: 
Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.12472: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.12475: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.12480: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.12757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.12962: done with get_vars() 12033 1726867162.12969: variable 'ansible_search_path' from source: unknown 12033 1726867162.12984: we have included files to process 12033 1726867162.12985: generating all_blocks data 12033 1726867162.12986: done generating all_blocks data 12033 1726867162.12989: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12033 1726867162.12990: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12033 1726867162.12993: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12033 1726867162.13674: in VariableManager get_vars() 12033 1726867162.13693: done with get_vars() 12033 1726867162.13705: done processing included file 12033 1726867162.13707: iterating over new_blocks loaded from include file 12033 1726867162.13709: in VariableManager get_vars() 12033 1726867162.13717: done with get_vars() 12033 1726867162.13718: filtering new block on tags 12033 1726867162.13731: done filtering new block on tags 12033 1726867162.13734: in VariableManager get_vars() 12033 1726867162.13743: done with get_vars() 12033 1726867162.13744: filtering new block on tags 12033 1726867162.13758: done filtering new block on tags 12033 1726867162.13760: in VariableManager get_vars() 12033 1726867162.13775: done with get_vars() 12033 1726867162.13776: filtering new block on tags 12033 
1726867162.13793: done filtering new block on tags 12033 1726867162.13795: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 12033 1726867162.13800: extending task lists for all hosts with included blocks 12033 1726867162.13846: done extending task lists 12033 1726867162.13847: done processing included files 12033 1726867162.13847: results queue empty 12033 1726867162.13848: checking for any_errors_fatal 12033 1726867162.13849: done checking for any_errors_fatal 12033 1726867162.13850: checking for max_fail_percentage 12033 1726867162.13851: done checking for max_fail_percentage 12033 1726867162.13851: checking to see if all hosts have failed and the running result is not ok 12033 1726867162.13852: done checking to see if all hosts have failed 12033 1726867162.13853: getting the remaining hosts for this loop 12033 1726867162.13854: done getting the remaining hosts for this loop 12033 1726867162.13856: getting the next task for host managed_node3 12033 1726867162.13859: done getting next task for host managed_node3 12033 1726867162.13861: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12033 1726867162.13864: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867162.13865: getting variables 12033 1726867162.13866: in VariableManager get_vars() 12033 1726867162.13882: Calling all_inventory to load vars for managed_node3 12033 1726867162.13885: Calling groups_inventory to load vars for managed_node3 12033 1726867162.13887: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.13894: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.13896: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.13899: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.14053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.14233: done with get_vars() 12033 1726867162.14241: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:19:22 -0400 (0:00:00.028) 0:00:01.258 ****** 12033 1726867162.14311: entering _queue_task() for managed_node3/setup 12033 1726867162.14668: worker is 1 (out of 1 available) 12033 1726867162.14680: exiting _queue_task() for managed_node3/setup 12033 1726867162.14693: done queuing things up, now waiting for results queue to drain 12033 1726867162.14694: waiting for pending results... 
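[Editor's note] The _low_level_execute_command step earlier in this log removes the per-task remote tmpdir with `/bin/sh -c 'rm -f -r <tmpdir> > /dev/null 2>&1 && sleep 0'`. A minimal local sketch of that cleanup pattern, with a locally created stand-in directory rather than the real remote path:

```python
import os
import subprocess
import tempfile

# Stand-in for Ansible's per-task remote tmpdir (the real one looks like
# ~/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>/).
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
open(os.path.join(tmpdir, "AnsiballZ_setup.py"), "w").close()  # fake module payload

# Same shape as the logged command: rm's output is discarded, and the trailing
# "sleep 0" gives the compound command a well-defined zero exit status.
subprocess.run(f"rm -f -r {tmpdir} > /dev/null 2>&1 && sleep 0",
               shell=True, check=True)

print(os.path.isdir(tmpdir))  # → False
```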
12033 1726867162.14838: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test
12033 1726867162.14926: in run() - task 0affcac9-a3a5-74bb-502b-000000000026
12033 1726867162.14939: variable 'ansible_search_path' from source: unknown
12033 1726867162.14942: variable 'ansible_search_path' from source: unknown
12033 1726867162.14985: calling self._execute()
12033 1726867162.15060: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867162.15084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867162.15136: variable 'omit' from source: magic vars
12033 1726867162.15652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867162.17658: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867162.17706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867162.17733: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867162.17765: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867162.17788: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867162.17846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867162.17866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867162.17885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867162.17918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867162.17929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867162.18044: variable 'ansible_facts' from source: unknown
12033 1726867162.18084: variable 'network_test_required_facts' from source: task vars
12033 1726867162.18117: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True
12033 1726867162.18120: variable 'omit' from source: magic vars
12033 1726867162.18145: variable 'omit' from source: magic vars
12033 1726867162.18165: variable 'omit' from source: magic vars
12033 1726867162.18185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12033 1726867162.18207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12033 1726867162.18220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12033 1726867162.18237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867162.18241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867162.18264: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12033 1726867162.18267: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867162.18269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867162.18334: Set connection var ansible_pipelining to False
12033 1726867162.18344: Set connection var ansible_shell_executable to /bin/sh
12033 1726867162.18347: Set connection var ansible_timeout to 10
12033 1726867162.18356: Set connection var ansible_module_compression to ZIP_DEFLATED
12033 1726867162.18360: Set connection var ansible_connection to ssh
12033 1726867162.18362: Set connection var ansible_shell_type to sh
12033 1726867162.18378: variable 'ansible_shell_executable' from source: unknown
12033 1726867162.18381: variable 'ansible_connection' from source: unknown
12033 1726867162.18384: variable 'ansible_module_compression' from source: unknown
12033 1726867162.18386: variable 'ansible_shell_type' from source: unknown
12033 1726867162.18388: variable 'ansible_shell_executable' from source: unknown
12033 1726867162.18394: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867162.18397: variable 'ansible_pipelining' from source: unknown
12033 1726867162.18400: variable 'ansible_timeout' from source: unknown
12033 1726867162.18404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867162.18501: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
12033 1726867162.18509: variable 'omit' from source: magic vars
12033 1726867162.18514: starting attempt loop
12033 1726867162.18517: running the handler
12033 1726867162.18527: _low_level_execute_command(): starting
12033 1726867162.18534: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12033 1726867162.19016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.19020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867162.19028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.19112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867162.19156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867162.20778: stdout chunk (state=3): >>>/root <<<
12033 1726867162.20875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867162.20901: stderr chunk (state=3): >>><<<
12033 1726867162.20905: stdout chunk (state=3): >>><<<
12033 1726867162.20922: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867162.20937: _low_level_execute_command(): starting
12033 1726867162.20941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243 `" && echo ansible-tmp-1726867162.2092233-12109-9515001238243="` echo /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243 `" ) && sleep 0'
12033 1726867162.21348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.21352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.21355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<<
12033 1726867162.21357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867162.21359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.21408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867162.21414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867162.21457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867162.23334: stdout chunk (state=3): >>>ansible-tmp-1726867162.2092233-12109-9515001238243=/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243 <<<
12033 1726867162.23445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867162.23469: stderr chunk (state=3): >>><<<
12033 1726867162.23472: stdout chunk (state=3): >>><<<
12033 1726867162.23486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867162.2092233-12109-9515001238243=/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867162.23525: variable 'ansible_module_compression' from source: unknown
12033 1726867162.23560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
12033 1726867162.23607: variable 'ansible_facts' from source: unknown
12033 1726867162.23742: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py
12033 1726867162.23829: Sending initial data
12033 1726867162.23832: Sent initial data (152 bytes)
12033 1726867162.24260: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.24267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.24354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867162.24358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867162.24360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867162.24416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867162.26002: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
12033 1726867162.26130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
<<<
12033 1726867162.26153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpbzfou1n4 /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py <<<
12033 1726867162.26156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpbzfou1n4" to remote "/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py" <<<
12033 1726867162.27585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867162.27753: stderr chunk (state=3): >>><<<
12033 1726867162.27757: stdout chunk (state=3): >>><<<
12033 1726867162.27759: done transferring module to remote
12033 1726867162.27762: _low_level_execute_command(): starting
12033 1726867162.27764: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/ /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py && sleep 0'
12033 1726867162.28747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867162.28753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
12033 1726867162.28756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.28758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.28771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867162.28907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867162.28922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867162.29282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867162.29285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867162.30902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867162.30906: stdout chunk (state=3): >>><<<
12033 1726867162.30908: stderr chunk (state=3): >>><<<
12033 1726867162.30924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867162.30932: _low_level_execute_command(): starting
12033 1726867162.31009: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/AnsiballZ_setup.py && sleep 0'
12033 1726867162.31514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867162.31528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867162.31541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.31558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867162.31573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<<
12033 1726867162.31610: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867162.31783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867162.31825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867162.34085: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
12033 1726867162.34134: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<<
12033 1726867162.34151: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<<
12033 1726867162.34180: stdout chunk (state=3): >>>import 'posix' # <<<
12033 1726867162.34258: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<<
12033 1726867162.34266: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<<
12033 1726867162.34320: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<<
12033 1726867162.34412: stdout chunk (state=3): >>>import '_codecs' # <<<
12033 1726867162.34415: stdout chunk (state=3): >>>import 'codecs' # <<<
12033 1726867162.34418: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
12033 1726867162.34530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d78184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d77e7b30> <<<
12033 1726867162.34534: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d781aa50> <<<
12033 1726867162.34564: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # <<<
12033 1726867162.34579: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<<
12033 1726867162.34681: stdout chunk (state=3): >>>import '_collections_abc' # <<<
12033 1726867162.34817: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<<
12033 1726867162.34821: stdout chunk (state=3): >>>import 'os' # <<<
12033 1726867162.34823: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<<
12033 1726867162.34831: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<<
12033 1726867162.34834: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<<
12033 1726867162.34947: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<<
12033 1726867162.34950: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d762d130> <<<
12033 1726867162.34953: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<<
12033 1726867162.34955: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d762dfa0> import 'site' #
<<<
12033 1726867162.35000: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
12033 1726867162.35387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
12033 1726867162.35392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
12033 1726867162.35450: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
12033 1726867162.35489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
12033 1726867162.35542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d766bec0> <<<
12033 1726867162.35546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
12033 1726867162.35572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d766bf80> <<<
12033 1726867162.35599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
12033 1726867162.35612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
12033 1726867162.35760: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76a3830> <<<
12033 1726867162.35790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76a3ec0> <<<
12033 1726867162.35802: stdout chunk (state=3): >>>import '_collections' # <<<
12033 1726867162.35857: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7683b60> import '_functools' # <<<
12033 1726867162.35956: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76812b0> <<<
12033 1726867162.35984: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7669070> <<<
12033 1726867162.36068: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<<
12033 1726867162.36109: stdout chunk (state=3): >>>import '_sre' # <<<
12033 1726867162.36182: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<<
12033 1726867162.36207: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c0bc0> <<<
12033 1726867162.36246: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f8890> <<<
12033 1726867162.36325: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d76f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f8bf0> <<<
12033 1726867162.36416: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d76f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7666e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<<
12033 1726867162.36553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f9370> import 'importlib.machinery' # <<<
12033 1726867162.36643: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fa540> import 'importlib.util' # <<<
12033 1726867162.36647: stdout chunk (state=3): >>>import 'runpy' # <<<
12033 1726867162.36673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7710740> import 'errno' # <<<
12033 1726867162.36717: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7711e20> <<<
12033 1726867162.36930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7712cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d77132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7712210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d77134a0> <<<
12033 1726867162.36969: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fa4b0> <<<
12033 1726867162.37000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<<
12033 1726867162.37027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<<
12033 1726867162.37053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<<
12033 1726867162.37093: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d744bc50> <<<
12033 1726867162.37117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<<
12033 1726867162.37250: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7474710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7474470> <<<
12033 1726867162.37255: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<<
12033 1726867162.37284: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' #
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7474740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.37411: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7475070> <<< 12033 1726867162.37529: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7475a60> <<< 12033 1726867162.37557: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7474920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7449df0> <<< 12033 1726867162.37616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12033 1726867162.37620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12033 1726867162.37673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7476e10> <<< 12033 1726867162.37710: stdout chunk (state=3): >>>import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7475b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12033 1726867162.37823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12033 1726867162.37826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12033 1726867162.37976: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d749f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12033 1726867162.38013: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74c3560> <<< 12033 1726867162.38038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12033 1726867162.38081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12033 1726867162.38137: stdout chunk (state=3): >>>import 'ntpath' # <<< 12033 1726867162.38161: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' 
import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d75242c0> <<< 12033 1726867162.38183: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12033 1726867162.38209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12033 1726867162.38238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12033 1726867162.38273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12033 1726867162.38361: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7526a20> <<< 12033 1726867162.38436: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d75243e0> <<< 12033 1726867162.38471: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74e52b0> <<< 12033 1726867162.38515: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d732d3d0> <<< 12033 1726867162.38526: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74c2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7477d70> <<< 12033 1726867162.38705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12033 1726867162.38728: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object 
at 0x7fc8d732d670> <<< 12033 1726867162.39002: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ylyeg3p2/ansible_setup_payload.zip' <<< 12033 1726867162.39017: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.39134: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.39152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12033 1726867162.39163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12033 1726867162.39199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12033 1726867162.39271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12033 1726867162.39308: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7397170> <<< 12033 1726867162.39318: stdout chunk (state=3): >>>import '_typing' # <<< 12033 1726867162.39497: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7376060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73751f0> # zipimport: zlib available <<< 12033 1726867162.39532: stdout chunk (state=3): >>>import 'ansible' # <<< 12033 1726867162.39574: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867162.39593: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 12033 1726867162.39611: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 
1726867162.41012: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.42138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12033 1726867162.42142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7395040> <<< 12033 1726867162.42176: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.42207: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12033 1726867162.42223: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 12033 1726867162.42238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12033 1726867162.42257: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c6b40> <<< 12033 1726867162.42298: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c68d0> <<< 12033 1726867162.42330: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c61e0> <<< 12033 1726867162.42353: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12033 1726867162.42407: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c6630> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7397e00> <<< 12033 1726867162.42433: stdout chunk (state=3): >>>import 'atexit' # <<< 12033 1726867162.42456: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c7920> <<< 12033 1726867162.42493: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c7b60> <<< 12033 1726867162.42538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12033 1726867162.42542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12033 1726867162.42558: stdout chunk (state=3): >>>import '_locale' # <<< 12033 1726867162.42615: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73ec050> <<< 12033 1726867162.42637: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12033 1726867162.42685: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12033 1726867162.42695: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d2de20> <<< 12033 1726867162.42763: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d2fa40> <<< 12033 1726867162.42766: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12033 1726867162.42802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12033 1726867162.42805: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d30350> <<< 12033 1726867162.42885: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12033 1726867162.42902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d314f0> <<< 12033 1726867162.42905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12033 1726867162.42946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12033 1726867162.42966: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12033 1726867162.42996: stdout chunk (state=3): >>>import 
'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d33fb0> <<< 12033 1726867162.43047: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d382f0> <<< 12033 1726867162.43089: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12033 1726867162.43136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12033 1726867162.43156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12033 1726867162.43258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12033 1726867162.43305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3bfe0> <<< 12033 1726867162.43315: stdout chunk (state=3): >>>import '_tokenize' # <<< 12033 1726867162.43371: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3aab0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc8d6d3a810> <<< 12033 1726867162.43423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12033 1726867162.43489: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3ad80> <<< 12033 1726867162.43569: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d32780> <<< 12033 1726867162.43706: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d80230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d80410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12033 1726867162.43739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import 
'_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d81e80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d81c40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12033 1726867162.43752: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.43771: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d84350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d824b0> <<< 12033 1726867162.43788: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12033 1726867162.43830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.43844: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12033 1726867162.43870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12033 1726867162.43901: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d87b30> <<< 12033 1726867162.44023: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d84500> <<< 12033 1726867162.44095: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.44100: 
stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88bc0> <<< 12033 1726867162.44124: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88d40> <<< 12033 1726867162.44170: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.44173: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88c80> <<< 12033 1726867162.44217: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d805f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12033 1726867162.44220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12033 1726867162.44239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12033 1726867162.44275: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.44295: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c102f0> <<< 12033 1726867162.44439: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.44450: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d8aab0> <<< 12033 1726867162.44483: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d8be30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d8a720> <<< 12033 1726867162.44524: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12033 1726867162.44542: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.44618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.44721: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 12033 1726867162.44760: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.common.text' # <<< 12033 1726867162.44782: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.44883: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.45014: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.45525: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.46269: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12033 1726867162.46286: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c198e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1a720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c119a0> <<< 12033 1726867162.46314: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12033 1726867162.46318: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.46336: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.46360: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 12033 
1726867162.46509: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.46674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1a450> <<< 12033 1726867162.46693: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47132: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47589: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47642: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47718: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12033 1726867162.47731: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47767: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47811: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12033 1726867162.47814: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47872: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.47972: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12033 1726867162.48027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 12033 1726867162.48031: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48067: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48104: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12033 1726867162.48115: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48320: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48547: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12033 1726867162.48598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 12033 1726867162.48672: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1b7a0> # zipimport: zlib available <<< 12033 1726867162.48747: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48825: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12033 1726867162.48852: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 12033 1726867162.48902: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48950: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12033 1726867162.48953: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.48999: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49035: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49097: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49209: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12033 1726867162.49227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.49273: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c26390> <<< 12033 1726867162.49317: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c219d0> <<< 12033 1726867162.49350: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12033 1726867162.49353: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49431: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49493: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49504: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49560: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 12033 1726867162.49607: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12033 1726867162.49628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12033 1726867162.49691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12033 1726867162.49705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12033 1726867162.49755: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d0ec90> <<< 12033 1726867162.49825: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73f2960> <<< 12033 1726867162.49885: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c264e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1b440> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12033 1726867162.49938: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.49954: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12033 1726867162.50036: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 12033 1726867162.50057: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 12033 1726867162.50103: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50203: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867162.50216: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50247: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50299: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50320: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 12033 1726867162.50385: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50451: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50528: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50560: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 12033 1726867162.50574: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12033 1726867162.50751: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50916: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.50968: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.51016: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 12033 1726867162.51019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.51041: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12033 1726867162.51053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12033 1726867162.51081: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12033 1726867162.51108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb6720> <<< 12033 1726867162.51140: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12033 1726867162.51149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 12033 1726867162.51168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches 
/usr/lib64/python3.12/pickle.py <<< 12033 1726867162.51210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12033 1726867162.51249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 12033 1726867162.51252: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6890380> <<< 12033 1726867162.51305: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.51308: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68905f0> <<< 12033 1726867162.51360: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6ca02c0> <<< 12033 1726867162.51393: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb72c0> <<< 12033 1726867162.51422: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb4e00> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb49e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12033 1726867162.51501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12033 1726867162.51532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches 
/usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 12033 1726867162.51572: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68935f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6892ea0> <<< 12033 1726867162.51586: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6893080> <<< 12033 1726867162.51609: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6892300> <<< 12033 1726867162.51624: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12033 1726867162.51725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 12033 1726867162.51733: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68937a0> <<< 12033 1726867162.51756: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12033 1726867162.51784: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 12033 1726867162.51810: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68da2a0> <<< 12033 1726867162.51856: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68d82f0> <<< 12033 1726867162.51874: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb4b00> <<< 12033 1726867162.51880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 12033 1726867162.51892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 12033 1726867162.51912: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12033 1726867162.51928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 12033 1726867162.51940: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.51999: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 12033 1726867162.52073: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52115: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12033 1726867162.52192: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52206: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system' # <<< 12033 1726867162.52210: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52243: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 12033 1726867162.52278: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52332: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12033 1726867162.52394: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52431: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12033 1726867162.52480: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52541: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52596: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52656: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.52709: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 12033 1726867162.52718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 12033 1726867162.52733: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53195: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 12033 1726867162.53642: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53688: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53742: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53771: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12033 1726867162.53822: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 12033 1726867162.53857: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 12033 1726867162.53896: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.53973: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12033 1726867162.54022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54070: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 12033 1726867162.54117: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 12033 1726867162.54156: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54230: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 12033 1726867162.54320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12033 1726867162.54342: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68da510> <<< 12033 1726867162.54370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12033 1726867162.54393: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12033 1726867162.54513: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68db0b0> import 'ansible.module_utils.facts.system.local' # <<< 12033 1726867162.54520: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12033 1726867162.54652: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54750: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12033 1726867162.54847: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54911: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.54985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 12033 1726867162.54990: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55034: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12033 1726867162.55128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12033 1726867162.55195: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.55250: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d691e510> <<< 12033 1726867162.55445: stdout chunk 
(state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d690e2a0> <<< 12033 1726867162.55449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 12033 1726867162.55460: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55514: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55576: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 12033 1726867162.55584: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55661: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55744: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55853: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.55996: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 12033 1726867162.56017: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56050: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 12033 1726867162.56108: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56156: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56200: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12033 1726867162.56223: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.56248: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' 
import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6931fd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d691daf0> import 'ansible.module_utils.facts.system.user' # <<< 12033 1726867162.56273: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56304: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 12033 1726867162.56344: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 12033 1726867162.56402: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56544: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 12033 1726867162.56801: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56903: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.56941: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 12033 1726867162.57009: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57048: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57051: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57184: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 12033 1726867162.57338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 12033 1726867162.57455: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12033 1726867162.57579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12033 1726867162.57603: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57646: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.57650: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.58190: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.58700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 12033 1726867162.58809: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.58918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12033 1726867162.58941: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59014: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 12033 1726867162.59276: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12033 1726867162.59453: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59477: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 12033 1726867162.59509: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 12033 1726867162.59658: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.59755: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 
1726867162.59952: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 12033 1726867162.60203: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60208: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 12033 1726867162.60297: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60308: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12033 1726867162.60384: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12033 1726867162.60472: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60515: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12033 1726867162.60576: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 12033 1726867162.60646: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60701: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.60760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12033 1726867162.60763: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61018: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 12033 1726867162.61349: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 12033 1726867162.61452: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 12033 1726867162.61526: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 12033 1726867162.61586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61609: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61645: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 12033 1726867162.61656: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 12033 1726867162.61844: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 12033 1726867162.61898: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.61943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 12033 1726867162.61982: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62000: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62041: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62093: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62158: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62240: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 12033 1726867162.62258: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62299: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62360: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12033 1726867162.62549: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 12033 1726867162.62802: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62852: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12033 1726867162.62855: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62901: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.62950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12033 1726867162.62953: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.63037: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.63135: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12033 1726867162.63138: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.63211: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.63306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12033 1726867162.63383: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.63622: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12033 1726867162.63663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12033 1726867162.63666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12033 1726867162.63710: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6733740> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6730080> <<< 12033 1726867162.63759: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d67317c0> <<< 12033 1726867162.64818: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "22", "epoch": "1726867162", "epoch_int": "1726867162", "date": "2024-09-20", "time": "17:19:22", "iso8601_micro": "2024-09-20T21:19:22.633674Z", "iso8601": "2024-09-20T21:19:22Z", "iso8601_basic": "20240920T171922633674", "iso8601_basic_short": "20240920T171922", "tz": "EDT", "tz_dst": 
"EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12033 1726867162.65432: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12033 1726867162.65493: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 12033 1726867162.65515: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # 
cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 12033 1726867162.65567: stdout chunk (state=3): >>># 
cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd <<< 12033 1726867162.65619: stdout chunk (state=3): >>># cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # 
cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale <<< 12033 1726867162.65671: stdout chunk 
(state=3): >>># cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue <<< 12033 1726867162.65738: stdout chunk (state=3): >>># cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos <<< 12033 1726867162.65763: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 
12033 1726867162.66151: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12033 1726867162.66174: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 12033 1726867162.66220: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 12033 1726867162.66244: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 12033 1726867162.66260: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 12033 1726867162.66283: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 12033 1726867162.66285: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12033 1726867162.66318: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 12033 1726867162.66351: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 12033 1726867162.66355: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 12033 1726867162.66393: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 12033 1726867162.66408: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 12033 1726867162.66425: stdout chunk (state=3): >>># destroy _pickle <<< 12033 1726867162.66440: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy 
_queue <<< 12033 1726867162.66454: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util <<< 12033 1726867162.66462: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 12033 1726867162.66495: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 12033 1726867162.66518: stdout chunk (state=3): >>># destroy _ssl <<< 12033 1726867162.66540: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 12033 1726867162.66553: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 12033 1726867162.66559: stdout chunk (state=3): >>># destroy errno # destroy json <<< 12033 1726867162.66594: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 12033 1726867162.66599: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing<<< 12033 1726867162.66602: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12033 1726867162.66646: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 12033 1726867162.66668: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 12033 1726867162.66684: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap 
<<< 12033 1726867162.66704: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12033 1726867162.66728: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 12033 1726867162.66732: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 12033 1726867162.66750: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 12033 1726867162.66765: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 12033 1726867162.66781: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 12033 1726867162.66796: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 12033 1726867162.66812: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 12033 1726867162.66827: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time<<< 12033 1726867162.66845: stdout chunk (state=3): >>> # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 12033 1726867162.66862: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12033 1726867162.67000: stdout chunk (state=3): >>># destroy sys.monitoring <<< 12033 1726867162.67018: stdout chunk (state=3): >>># destroy _socket <<< 12033 1726867162.67025: stdout chunk (state=3): >>># destroy _collections <<< 12033 1726867162.67055: stdout chunk (state=3): >>># destroy platform <<< 12033 1726867162.67058: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 12033 1726867162.67092: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12033 1726867162.67124: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 12033 1726867162.67130: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 12033 1726867162.67155: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12033 1726867162.67173: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12033 1726867162.67268: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases 
# destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 12033 1726867162.67280: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 12033 1726867162.67304: stdout chunk (state=3): >>># destroy time <<< 12033 1726867162.67311: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 12033 1726867162.67331: stdout chunk (state=3): >>># destroy _hashlib <<< 12033 1726867162.67349: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 12033 1726867162.67366: stdout chunk (state=3): >>># destroy itertools <<< 12033 1726867162.67390: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12033 1726867162.67399: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12033 1726867162.67803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867162.67808: stdout chunk (state=3): >>><<< 12033 1726867162.67810: stderr chunk (state=3): >>><<< 12033 1726867162.67946: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d78184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d77e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d781aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d762d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d762dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d766bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d766bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7683b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7669070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7682150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d76f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d76f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7666e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7710740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7711e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7712cc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d77132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7712210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7713d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d77134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d744bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7474710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7474470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7474740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7475070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d7475a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7474920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7449df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7476e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7475b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d76fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d749f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74c3560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d75242c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7526a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d75243e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74e52b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d732d3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d74c2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7477d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc8d732d670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ylyeg3p2/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc8d7397170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7376060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73751f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7395040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c6b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c68d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c61e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73c6630> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d7397e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c7920> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d73c7b60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73ec050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d2de20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d2fa40> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d30350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d314f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d33fb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d382f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3bfe0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3aab0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3a810> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d3ad80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d32780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d80230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d80410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d81e80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d81c40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d84350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d824b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d87b30> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d84500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88d40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d88c80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d805f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c102f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d8aab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6d8be30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d8a720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c198e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1a720> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c119a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1a450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1b7a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6c26390> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c219d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6d0ec90> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d73f2960> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c264e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6c1b440> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb6720> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6890380> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68905f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6ca02c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb72c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb4e00> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb49e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68935f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6892ea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6893080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6892300> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68937a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d68da2a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68d82f0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6cb4b00> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68da510> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d68db0b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d691e510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d690e2a0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6931fd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d691daf0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc8d6733740> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d6730080> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc8d67317c0> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": 
"22", "epoch": "1726867162", "epoch_int": "1726867162", "date": "2024-09-20", "time": "17:19:22", "iso8601_micro": "2024-09-20T21:19:22.633674Z", "iso8601": "2024-09-20T21:19:22Z", "iso8601_basic": "20240920T171922633674", "iso8601_basic_short": "20240920T171922", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 
49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear 
sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # clear sys.audit hooks 12033 1726867162.69206: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867162.69208: _low_level_execute_command(): starting 12033 1726867162.69210: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867162.2092233-12109-9515001238243/ > /dev/null 2>&1 && sleep 0' 12033 1726867162.69211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867162.69213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.69214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.69215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867162.69217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867162.69218: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867162.69219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.69221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867162.69224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867162.69227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867162.69228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.69230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.69233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867162.69236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867162.69238: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867162.69239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.69241: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867162.69244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867162.69246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.69251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.70860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.70882: stderr chunk (state=3): >>><<< 12033 1726867162.70892: stdout chunk (state=3): >>><<< 12033 1726867162.70904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867162.70912: handler run complete 12033 1726867162.70939: variable 'ansible_facts' from source: unknown 12033 1726867162.70973: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.71048: variable 'ansible_facts' from source: unknown 12033 1726867162.71079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.71113: attempt loop complete, returning result 12033 1726867162.71116: _execute() done 12033 1726867162.71118: dumping result to json 12033 1726867162.71129: done dumping result, returning 12033 1726867162.71136: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-74bb-502b-000000000026] 12033 1726867162.71140: sending task result for task 0affcac9-a3a5-74bb-502b-000000000026 12033 1726867162.71265: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000026 12033 1726867162.71268: WORKER PROCESS EXITING ok: [managed_node3] 12033 1726867162.71365: no more pending results, returning what we have 12033 1726867162.71368: results queue empty 12033 1726867162.71369: checking for any_errors_fatal 12033 1726867162.71370: done checking for any_errors_fatal 12033 1726867162.71371: checking for max_fail_percentage 12033 1726867162.71372: done checking for max_fail_percentage 12033 1726867162.71373: checking to see if all hosts have failed and the running result is not ok 12033 1726867162.71373: done checking to see if all hosts have failed 12033 1726867162.71374: getting the remaining hosts for this loop 12033 1726867162.71376: done getting the remaining hosts for this loop 12033 1726867162.71386: getting the next task for host managed_node3 12033 1726867162.71396: done getting next task for host managed_node3 12033 1726867162.71398: ^ task is: TASK: Check if system is ostree 12033 1726867162.71400: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867162.71403: getting variables 12033 1726867162.71404: in VariableManager get_vars() 12033 1726867162.71428: Calling all_inventory to load vars for managed_node3 12033 1726867162.71431: Calling groups_inventory to load vars for managed_node3 12033 1726867162.71433: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867162.71442: Calling all_plugins_play to load vars for managed_node3 12033 1726867162.71444: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867162.71446: Calling groups_plugins_play to load vars for managed_node3 12033 1726867162.71581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867162.71749: done with get_vars() 12033 1726867162.71760: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:19:22 -0400 (0:00:00.575) 0:00:01.834 ****** 12033 1726867162.71857: entering _queue_task() for managed_node3/stat 12033 1726867162.72095: worker is 1 (out of 1 available) 12033 1726867162.72107: exiting _queue_task() for managed_node3/stat 12033 1726867162.72119: done queuing things up, now waiting for results queue to drain 12033 1726867162.72120: waiting for pending results... 
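(Annotation: every remote step in this log goes through `_low_level_execute_command()`, which the log shows wrapping each command as `/bin/sh -c '<cmd> && sleep 0'` — e.g. the `rm -f -r …` and `echo ~` calls above and below. A minimal local sketch of that wrapper pattern, not Ansible's actual implementation:)

```python
# Sketch only: mimic the `/bin/sh -c '<cmd> && sleep 0'` wrapper that
# _low_level_execute_command() uses for every remote command in this log.
import subprocess

def low_level_execute(cmd: str) -> tuple[int, str, str]:
    # The trailing `&& sleep 0` mirrors the pattern seen in the log;
    # the command is passed to /bin/sh exactly as the log shows.
    proc = subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True, text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# Same probe as the log's home-directory discovery step:
rc, out, err = low_level_execute("echo ~")
```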
12033 1726867162.72411: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 12033 1726867162.72469: in run() - task 0affcac9-a3a5-74bb-502b-000000000028 12033 1726867162.72472: variable 'ansible_search_path' from source: unknown 12033 1726867162.72474: variable 'ansible_search_path' from source: unknown 12033 1726867162.72480: calling self._execute() 12033 1726867162.72545: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867162.72550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867162.72567: variable 'omit' from source: magic vars 12033 1726867162.72911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867162.73073: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867162.73109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867162.73134: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867162.73173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867162.73236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867162.73254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867162.73271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867162.73293: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867162.73376: Evaluated conditional (not __network_is_ostree is defined): True 12033 1726867162.73381: variable 'omit' from source: magic vars 12033 1726867162.73407: variable 'omit' from source: magic vars 12033 1726867162.73433: variable 'omit' from source: magic vars 12033 1726867162.73451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867162.73472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867162.73490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867162.73502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867162.73510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867162.73533: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867162.73537: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867162.73539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867162.73604: Set connection var ansible_pipelining to False 12033 1726867162.73611: Set connection var ansible_shell_executable to /bin/sh 12033 1726867162.73618: Set connection var ansible_timeout to 10 12033 1726867162.73622: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867162.73626: Set connection var ansible_connection to ssh 12033 1726867162.73629: Set connection var ansible_shell_type to sh 12033 1726867162.73647: variable 'ansible_shell_executable' from source: unknown 12033 1726867162.73650: variable 'ansible_connection' from 
source: unknown 12033 1726867162.73653: variable 'ansible_module_compression' from source: unknown 12033 1726867162.73656: variable 'ansible_shell_type' from source: unknown 12033 1726867162.73658: variable 'ansible_shell_executable' from source: unknown 12033 1726867162.73660: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867162.73662: variable 'ansible_pipelining' from source: unknown 12033 1726867162.73664: variable 'ansible_timeout' from source: unknown 12033 1726867162.73668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867162.73763: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867162.73771: variable 'omit' from source: magic vars 12033 1726867162.73776: starting attempt loop 12033 1726867162.73780: running the handler 12033 1726867162.73793: _low_level_execute_command(): starting 12033 1726867162.73799: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867162.74255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.74259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.74262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12033 1726867162.74264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.74307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867162.74322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.74368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.75944: stdout chunk (state=3): >>>/root <<< 12033 1726867162.76046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.76067: stderr chunk (state=3): >>><<< 12033 1726867162.76071: stdout chunk (state=3): >>><<< 12033 1726867162.76091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867162.76107: _low_level_execute_command(): starting 12033 1726867162.76111: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306 `" && echo ansible-tmp-1726867162.7609186-12145-63700884865306="` echo /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306 `" ) && sleep 0' 12033 1726867162.76502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.76507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.76509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.76511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.76556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867162.76560: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 12033 1726867162.76562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.76617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.78497: stdout chunk (state=3): >>>ansible-tmp-1726867162.7609186-12145-63700884865306=/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306 <<< 12033 1726867162.78606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.78627: stderr chunk (state=3): >>><<< 12033 1726867162.78630: stdout chunk (state=3): >>><<< 12033 1726867162.78642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867162.7609186-12145-63700884865306=/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
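(Annotation: the round trip above creates the per-task remote tmpdir with `( umask 77 && mkdir -p "<base>" && mkdir "<base>/ansible-tmp-<ts>-<pid>-<rand>" )`. A local sketch of that step; the base path here is a stand-in created with `tempfile`, not Ansible's real `~/.ansible/tmp`:)

```python
# Sketch of the tmpdir-creation command seen in the log, run locally.
import os
import random
import subprocess
import tempfile
import time

base = tempfile.mkdtemp()  # stand-in for the remote ~/.ansible/tmp
name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
cmd = f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{name}" ) && sleep 0'
rc = subprocess.run(["/bin/sh", "-c", cmd]).returncode

tmpdir = os.path.join(base, name)
# `umask 77` leaves the new directory owner-only (mode 0700), so the
# staged AnsiballZ payload is not readable by other users on the host.
```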
12033 1726867162.78679: variable 'ansible_module_compression' from source: unknown 12033 1726867162.78724: ANSIBALLZ: Using lock for stat 12033 1726867162.78728: ANSIBALLZ: Acquiring lock 12033 1726867162.78730: ANSIBALLZ: Lock acquired: 139897899013792 12033 1726867162.78732: ANSIBALLZ: Creating module 12033 1726867162.85954: ANSIBALLZ: Writing module into payload 12033 1726867162.86016: ANSIBALLZ: Writing module 12033 1726867162.86034: ANSIBALLZ: Renaming module 12033 1726867162.86039: ANSIBALLZ: Done creating module 12033 1726867162.86052: variable 'ansible_facts' from source: unknown 12033 1726867162.86115: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py 12033 1726867162.86213: Sending initial data 12033 1726867162.86217: Sent initial data (152 bytes) 12033 1726867162.86655: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.86658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867162.86660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.86664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.86666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867162.86668: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.86723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867162.86728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867162.86731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.86772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.88345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867162.88396: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867162.88443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpp0hq4a20 /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py <<< 12033 1726867162.88447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py" <<< 12033 1726867162.88489: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpp0hq4a20" to remote "/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py" <<< 12033 1726867162.89022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.89059: stderr chunk (state=3): >>><<< 12033 1726867162.89062: stdout chunk (state=3): >>><<< 12033 1726867162.89084: done transferring module to remote 12033 1726867162.89100: _low_level_execute_command(): starting 12033 1726867162.89103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/ /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py && sleep 0' 12033 1726867162.89521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.89524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867162.89527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.89530: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867162.89533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.89534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867162.89537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.89582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867162.89594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.89637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.91384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867162.91404: stderr chunk (state=3): >>><<< 12033 1726867162.91407: stdout chunk (state=3): >>><<< 12033 1726867162.91418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867162.91420: _low_level_execute_command(): starting 12033 1726867162.91425: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/AnsiballZ_stat.py && sleep 0' 12033 1726867162.91821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867162.91824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.91827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867162.91829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867162.91872: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867162.91875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867162.91929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867162.94058: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12033 1726867162.94095: stdout chunk (state=3): >>>import _imp # builtin <<< 12033 1726867162.94123: stdout chunk (state=3): >>>import '_thread' # <<< 12033 1726867162.94128: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 12033 1726867162.94200: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12033 1726867162.94240: stdout chunk (state=3): >>>import 'posix' # <<< 12033 1726867162.94275: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12033 1726867162.94307: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 12033 1726867162.94312: stdout chunk (state=3): >>># installed zipimport hook <<< 12033 1726867162.94364: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12033 1726867162.94369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.94393: stdout chunk (state=3): >>>import '_codecs' # <<< 12033 1726867162.94412: stdout chunk (state=3): >>>import 'codecs' # <<< 12033 1726867162.94446: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12033 1726867162.94472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 12033 1726867162.94489: stdout chunk (state=3): >>>import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f40371104d0> <<< 12033 1726867162.94507: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40370dfb30> <<< 12033 1726867162.94520: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12033 1726867162.94525: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4037112a50> <<< 12033 1726867162.94551: stdout chunk (state=3): >>>import '_signal' # <<< 12033 1726867162.94579: stdout chunk (state=3): >>>import '_abc' # <<< 12033 1726867162.94587: stdout chunk (state=3): >>>import 'abc' # <<< 12033 1726867162.94601: stdout chunk (state=3): >>>import 'io' # <<< 12033 1726867162.94632: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 12033 1726867162.94722: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12033 1726867162.94748: stdout chunk (state=3): >>>import 'genericpath' # <<< 12033 1726867162.94753: stdout chunk (state=3): >>>import 'posixpath' # <<< 12033 1726867162.94778: stdout chunk (state=3): >>>import 'os' # <<< 12033 1726867162.94804: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 12033 1726867162.94811: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 12033 1726867162.94831: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 12033 1726867162.94840: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 12033 1726867162.94845: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12033 1726867162.94871: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12033 1726867162.94899: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036ee5130> <<< 12033 1726867162.94958: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12033 1726867162.94965: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.94973: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036ee5fa0> <<< 12033 1726867162.95004: stdout chunk (state=3): >>>import 'site' # <<< 12033 1726867162.95030: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 12033 1726867162.95265: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12033 1726867162.95269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12033 1726867162.95298: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12033 1726867162.95303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.95329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12033 1726867162.95366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12033 1726867162.95392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12033 1726867162.95410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12033 1726867162.95429: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f23ec0> <<< 12033 1726867162.95442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12033 1726867162.95462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12033 1726867162.95491: stdout chunk (state=3): >>>import '_operator' # <<< 12033 1726867162.95499: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f23f80> <<< 12033 1726867162.95511: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12033 1726867162.95541: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12033 1726867162.95562: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12033 1726867162.95615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.95628: stdout chunk (state=3): >>>import 'itertools' # <<< 12033 1726867162.95657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 12033 1726867162.95667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f5b830> <<< 12033 1726867162.95687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 12033 1726867162.95697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f5bec0> <<< 12033 1726867162.95720: stdout chunk (state=3): >>>import '_collections' # <<< 12033 1726867162.95759: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f3bb60> <<< 12033 1726867162.95778: stdout chunk (state=3): >>>import '_functools' # <<< 12033 1726867162.95802: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f392b0> <<< 12033 1726867162.95896: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f21070> <<< 12033 1726867162.95919: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 12033 1726867162.95945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12033 1726867162.95949: stdout chunk (state=3): >>>import '_sre' # <<< 12033 1726867162.95978: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12033 1726867162.95998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12033 1726867162.96022: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 12033 1726867162.96025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12033 1726867162.96066: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f7b7d0> <<< 12033 1726867162.96072: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f7a3f0> <<< 12033 1726867162.96107: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f3a150> <<< 12033 1726867162.96110: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f78bc0> <<< 12033 1726867162.96172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12033 1726867162.96178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4036fb0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f202f0> <<< 12033 1726867162.96206: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12033 1726867162.96241: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fb0d40> <<< 12033 1726867162.96248: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb0bf0> <<< 12033 1726867162.96293: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.96298: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fb0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f1ee10> <<< 12033 1726867162.96334: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 12033 1726867162.96339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.96359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12033 1726867162.96400: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 12033 1726867162.96409: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb1370> <<< 12033 1726867162.96414: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 12033 1726867162.96444: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12033 1726867162.96472: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb2540> <<< 12033 1726867162.96490: stdout chunk (state=3): >>>import 'importlib.util' # <<< 12033 1726867162.96501: stdout chunk (state=3): >>>import 'runpy' # <<< 12033 1726867162.96513: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12033 1726867162.96546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 12033 1726867162.96571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 12033 1726867162.96584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fc8740> <<< 12033 1726867162.96600: stdout chunk (state=3): >>>import 'errno' # <<< 12033 1726867162.96626: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.96631: stdout chunk (state=3): >>># extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fc9e20> <<< 12033 1726867162.96659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 12033 1726867162.96664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 12033 1726867162.96699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 12033 1726867162.96712: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fcacc0> <<< 12033 1726867162.96750: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.96762: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fcb2f0> <<< 12033 1726867162.96773: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fca210> <<< 12033 1726867162.96784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 12033 1726867162.96801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12033 1726867162.96836: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.96852: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fcbd70> <<< 12033 1726867162.96859: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fcb4a0> <<< 12033 1726867162.96900: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb24b0> <<< 12033 1726867162.96925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12033 1726867162.96948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12033 1726867162.96969: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12033 1726867162.96989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12033 1726867162.97021: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d5bc50> <<< 12033 1726867162.97049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12033 1726867162.97080: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f4036d84710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d84470> <<< 12033 1726867162.97107: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d84740> <<< 12033 1726867162.97141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12033 1726867162.97216: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.97348: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.97351: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d85070> <<< 12033 1726867162.97465: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867162.97469: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d85a60> <<< 12033 1726867162.97480: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d84920> <<< 12033 1726867162.97488: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d59df0> <<< 12033 1726867162.97513: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12033 1726867162.97530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12033 1726867162.97559: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 12033 1726867162.97569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 12033 1726867162.97585: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d86e10> <<< 12033 1726867162.97596: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d85b50> <<< 12033 1726867162.97615: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb2c60> <<< 12033 1726867162.97643: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12033 1726867162.97704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.97725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12033 1726867162.97760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12033 1726867162.97791: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036daf1a0> <<< 12033 1726867162.97842: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12033 1726867162.97851: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867162.97881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12033 1726867162.97896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12033 1726867162.97936: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036dd3560> <<< 12033 1726867162.97962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12033 1726867162.98005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12033 1726867162.98055: stdout chunk (state=3): >>>import 'ntpath' # <<< 12033 1726867162.98081: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e342c0> <<< 12033 1726867162.98105: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12033 1726867162.98132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12033 1726867162.98164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12033 1726867162.98198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12033 1726867162.98286: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e36a20> <<< 
12033 1726867162.98359: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e343e0> <<< 12033 1726867162.98398: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036df52b0> <<< 12033 1726867162.98425: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 12033 1726867162.98439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367293d0> <<< 12033 1726867162.98456: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036dd2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d87d70> <<< 12033 1726867162.98563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12033 1726867162.98583: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4036729670> <<< 12033 1726867162.98722: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_9q9ucceo/ansible_stat_payload.zip' <<< 12033 1726867162.98727: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.98854: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.98890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12033 1726867162.98896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12033 1726867162.98941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12033 
1726867162.99011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12033 1726867162.99048: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677f170> <<< 12033 1726867162.99056: stdout chunk (state=3): >>>import '_typing' # <<< 12033 1726867162.99240: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403675e060> <<< 12033 1726867162.99243: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403675d1f0> # zipimport: zlib available <<< 12033 1726867162.99281: stdout chunk (state=3): >>>import 'ansible' # <<< 12033 1726867162.99287: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.99314: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.99317: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867162.99342: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 12033 1726867162.99346: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.00755: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.01860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12033 1726867163.01866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677d040> <<< 12033 1726867163.01897: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867163.01926: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12033 1726867163.01929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12033 1726867163.01972: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12033 1726867163.01999: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a6ae0> <<< 12033 1726867163.02033: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a6870> <<< 12033 1726867163.02059: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a6180> <<< 12033 1726867163.02086: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12033 1726867163.02093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12033 1726867163.02129: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a65d0> <<< 12033 1726867163.02142: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677fe00> <<< 12033 1726867163.02155: stdout chunk 
(state=3): >>>import 'atexit' # <<< 12033 1726867163.02182: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.02188: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a7890> <<< 12033 1726867163.02217: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.02220: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a7ad0> <<< 12033 1726867163.02241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12033 1726867163.02280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12033 1726867163.02296: stdout chunk (state=3): >>>import '_locale' # <<< 12033 1726867163.02338: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a7f50> <<< 12033 1726867163.02351: stdout chunk (state=3): >>>import 'pwd' # <<< 12033 1726867163.02370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12033 1726867163.02398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12033 1726867163.02434: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036611ca0> <<< 12033 1726867163.02464: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.02469: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036613470> <<< 12033 1726867163.02488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12033 1726867163.02507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12033 1726867163.02541: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366142f0> <<< 12033 1726867163.02562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12033 1726867163.02595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 12033 1726867163.02603: stdout chunk (state=3): >>> <<< 12033 1726867163.02608: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036615490> <<< 12033 1726867163.02636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12033 1726867163.02668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12033 1726867163.02693: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12033 1726867163.02757: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036617f80> <<< 12033 1726867163.02784: stdout chunk (state=3): >>># 
extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.02796: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403661c620> <<< 12033 1726867163.02799: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036616240> <<< 12033 1726867163.02827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12033 1726867163.02850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12033 1726867163.02880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12033 1726867163.02902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12033 1726867163.02928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12033 1726867163.02955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12033 1726867163.02976: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661ff20> <<< 12033 1726867163.02980: stdout chunk (state=3): >>>import '_tokenize' # <<< 12033 1726867163.03045: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661e9f0> 
import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661e750> <<< 12033 1726867163.03072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12033 1726867163.03079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12033 1726867163.03153: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661ecc0> <<< 12033 1726867163.03179: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036616750> <<< 12033 1726867163.03210: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036667ef0> <<< 12033 1726867163.03239: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366681d0> <<< 12033 1726867163.03258: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12033 1726867163.03282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12033 1726867163.03306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12033 1726867163.03338: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.03344: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036669c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036669a30> <<< 12033 1726867163.03365: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12033 1726867163.03460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12033 1726867163.03511: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403666c200> <<< 12033 1726867163.03518: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666a360> <<< 12033 1726867163.03540: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12033 1726867163.03583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867163.03606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12033 1726867163.03614: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 12033 1726867163.03616: stdout chunk (state=3): >>>import '_string' # <<< 12033 1726867163.03664: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666f8f0> <<< 12033 1726867163.03784: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666c2c0> <<< 12033 1726867163.03850: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366703e0> <<< 12033 1726867163.03881: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036670bf0> <<< 12033 1726867163.03921: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036670aa0> <<< 12033 1726867163.03946: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366682f0> <<< 12033 1726867163.03966: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12033 1726867163.03990: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12033 1726867163.04008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12033 1726867163.04033: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12033 1726867163.04061: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366f8110> <<< 12033 1726867163.04202: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366f90d0> <<< 12033 1726867163.04223: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366728a0> <<< 12033 1726867163.04253: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036673c50> <<< 12033 1726867163.04264: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036672510> <<< 12033 
1726867163.04269: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04296: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12033 1726867163.04310: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04399: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04497: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04501: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12033 1726867163.04519: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04538: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 12033 1726867163.04554: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04670: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.04793: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.05317: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.05858: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 12033 1726867163.05866: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 12033 1726867163.05869: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 12033 1726867163.05901: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12033 1726867163.05909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867163.05962: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036501400> <<< 12033 1726867163.06043: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12033 1726867163.06067: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036502150> <<< 12033 1726867163.06074: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366fbfe0> <<< 12033 1726867163.06129: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12033 1726867163.06135: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.06164: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.06169: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 12033 1726867163.06191: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.06337: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.06494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12033 1726867163.06500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036502930> <<< 12033 1726867163.06520: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.06971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07418: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07493: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07566: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.collections' # <<< 12033 1726867163.07571: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07652: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12033 1726867163.07660: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07729: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07813: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12033 1726867163.07819: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07842: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 12033 1726867163.07857: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07896: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.07935: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12033 1726867163.07946: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08176: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12033 1726867163.08461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12033 1726867163.08468: stdout chunk (state=3): >>>import '_ast' # <<< 12033 1726867163.08536: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40365033e0> <<< 12033 1726867163.08541: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08617: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08705: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 
12033 1726867163.08709: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12033 1726867163.08711: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12033 1726867163.08732: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08781: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08813: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12033 1726867163.08830: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08869: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08913: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.08971: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09040: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12033 1726867163.09074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867163.09155: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403650e000> <<< 12033 1726867163.09192: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036509670> <<< 12033 1726867163.09222: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12033 1726867163.09237: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12033 1726867163.09298: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09366: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09390: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09434: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12033 1726867163.09458: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12033 1726867163.09478: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12033 1726867163.09499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12033 1726867163.09553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12033 1726867163.09582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12033 1726867163.09595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12033 1726867163.09650: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367fa960> <<< 12033 1726867163.09693: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367e2630> <<< 12033 1726867163.09764: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403650e1b0> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f403666c290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 12033 1726867163.09782: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09810: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09838: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12033 1726867163.09896: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12033 1726867163.09910: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.09927: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 12033 1726867163.09947: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.10075: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.10264: stdout chunk (state=3): >>># zipimport: zlib available <<< 12033 1726867163.10390: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 12033 1726867163.10683: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 12033 1726867163.10700: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread <<< 12033 1726867163.10708: stdout chunk (state=3): >>># cleanup[2] removing _warnings # 
cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 12033 1726867163.10739: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator <<< 12033 1726867163.10743: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii <<< 12033 1726867163.10755: stdout chunk (state=3): >>># cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] 
removing importlib.machinery <<< 12033 1726867163.10782: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 12033 1726867163.10790: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing <<< 12033 1726867163.10807: stdout chunk (state=3): >>># destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 12033 1726867163.10815: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # 
cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token <<< 12033 1726867163.10824: stdout chunk (state=3): >>># cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 12033 1726867163.10840: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc <<< 12033 1726867163.10862: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # 
destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing <<< 12033 1726867163.10878: stdout chunk (state=3): >>># destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12033 1726867163.11117: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12033 1726867163.11134: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12033 1726867163.11153: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 12033 1726867163.11181: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii <<< 12033 1726867163.11184: stdout chunk (state=3): >>># destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 12033 1726867163.11187: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob<<< 12033 1726867163.11189: stdout chunk (state=3): >>> # destroy fnmatch # destroy ipaddress <<< 12033 1726867163.11219: stdout chunk (state=3): >>># destroy ntpath <<< 12033 1726867163.11242: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 12033 1726867163.11260: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 12033 1726867163.11276: stdout chunk (state=3): >>># destroy _locale # destroy pwd <<< 12033 1726867163.11284: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 12033 1726867163.11311: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 12033 1726867163.11314: stdout chunk (state=3): >>># destroy 
array # destroy datetime <<< 12033 1726867163.11349: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 12033 1726867163.11352: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex <<< 12033 1726867163.11356: stdout chunk (state=3): >>># destroy subprocess <<< 12033 1726867163.11407: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 12033 1726867163.11416: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 12033 1726867163.11435: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12033 1726867163.11454: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 12033 1726867163.11470: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 12033 1726867163.11489: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 12033 1726867163.11504: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 12033 1726867163.11522: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 12033 1726867163.11537: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 12033 1726867163.11553: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 12033 1726867163.11580: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12033 1726867163.11585: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12033 1726867163.11716: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12033 1726867163.11744: stdout chunk (state=3): >>># destroy _collections <<< 12033 1726867163.11763: stdout chunk (state=3): >>># destroy platform <<< 12033 1726867163.11766: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 12033 1726867163.11794: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg 
# destroy contextlib <<< 12033 1726867163.11825: stdout chunk (state=3): >>># destroy _typing <<< 12033 1726867163.11828: stdout chunk (state=3): >>># destroy _tokenize <<< 12033 1726867163.11831: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 12033 1726867163.11833: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12033 1726867163.11857: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12033 1726867163.11948: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 12033 1726867163.11954: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 12033 1726867163.11957: stdout chunk (state=3): >>># destroy atexit # destroy _warnings <<< 12033 1726867163.11986: stdout chunk (state=3): >>># destroy math # destroy _bisect # destroy time # destroy _random <<< 12033 1726867163.12009: stdout chunk (state=3): >>># destroy _weakref <<< 12033 1726867163.12014: stdout chunk (state=3): >>># destroy _hashlib <<< 12033 1726867163.12022: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 12033 1726867163.12048: stdout chunk (state=3): >>># destroy itertools <<< 12033 1726867163.12051: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix <<< 12033 1726867163.12053: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 
12033 1726867163.12363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867163.12396: stderr chunk (state=3): >>><<< 12033 1726867163.12400: stdout chunk (state=3): >>><<< 12033 1726867163.12470: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40371104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40370dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4037112a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036ee5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036ee5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f23ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4036f23f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f5b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f5bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f3bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f21070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4036f7b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f7a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f78bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fb0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fb0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036f1ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fc8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fc9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4036fcacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036fcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fcb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d5bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d84710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d84470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d84740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d85070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036d85a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d84920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d59df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d86e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d85b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036fb2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036daf1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036dd3560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e342c0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e36a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036e343e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036df52b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036dd2360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036d87d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4036729670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_9q9ucceo/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403675e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403675d1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a6ae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a6870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a6180> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a65d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403677fe00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a7890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40367a7ad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367a7f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036611ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036613470> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366142f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036615490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036617f80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403661c620> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036616240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661ff20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661e750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403661ecc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036616750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036667ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366681d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036669c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036669a30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403666c200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666a360> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666f8f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666c2c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366703e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036670bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036670aa0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366f8110> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f40366f90d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366728a0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036673c50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036672510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4036501400> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036502150> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40366fbfe0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036502930> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40365033e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f403650e000> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4036509670> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367fa960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f40367e2630> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403650e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f403666c290> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
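The `[WARNING]: Module invocation had junk after the JSON data` that the controller prints next arises because Ansible expects the module's stdout to carry a single JSON result object, while the `import ...` / `# cleanup[...]` / `# destroy ...` lines characteristic of CPython's verbose import tracing land in the same stream. The controller simply discards the non-JSON noise and parses the payload. A minimal sketch of that idea (the helper name and the one-line-payload assumption are illustrative, not Ansible's actual filtering API):

```python
import json

def extract_json_payload(stdout: str):
    """Return the first parseable JSON object found in module stdout.

    Illustrative helper, not Ansible's real API: the idea is that anything
    before or after the JSON payload in the module's stdout is treated as
    junk and skipped.
    """
    for line in stdout.splitlines():
        line = line.strip()
        if line.startswith("{"):
            try:
                return json.loads(line)
            except json.JSONDecodeError:
                continue  # a brace-prefixed trace line; keep scanning
    return None

# Stdout shaped like the capture above: verbose-import noise around the result.
noisy = (
    "import 'ansible.modules' #\n"
    '{"changed": false, "stat": {"exists": false}}\n'
    "# destroy __main__ # clear sys.path_importer_cache"
)
print(extract_json_payload(noisy)["stat"]["exists"])  # False
```

In the capture above the recovered payload reports `stat.exists: false` for `/run/ostree-booted`, which is how the "Check if system is ostree" task concludes the managed host is not an ostree-based system.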
[WARNING]: Module invocation had junk after the JSON data (the same interpreter shutdown trace already shown in full in the module stdout above) 12033 1726867163.13003: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/',
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867163.13010: _low_level_execute_command(): starting 12033 1726867163.13013: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867162.7609186-12145-63700884865306/ > /dev/null 2>&1 && sleep 0' 12033 1726867163.13120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867163.13142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867163.13144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.13201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867163.13204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867163.13206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867163.13246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867163.15068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867163.15096: 
stderr chunk (state=3): >>><<< 12033 1726867163.15099: stdout chunk (state=3): >>><<< 12033 1726867163.15111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867163.15117: handler run complete 12033 1726867163.15133: attempt loop complete, returning result 12033 1726867163.15135: _execute() done 12033 1726867163.15138: dumping result to json 12033 1726867163.15140: done dumping result, returning 12033 1726867163.15148: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affcac9-a3a5-74bb-502b-000000000028] 12033 1726867163.15150: sending task result for task 0affcac9-a3a5-74bb-502b-000000000028 12033 1726867163.15234: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000028 12033 1726867163.15237: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, 
"stat": { "exists": false } } 12033 1726867163.15305: no more pending results, returning what we have 12033 1726867163.15308: results queue empty 12033 1726867163.15309: checking for any_errors_fatal 12033 1726867163.15314: done checking for any_errors_fatal 12033 1726867163.15315: checking for max_fail_percentage 12033 1726867163.15316: done checking for max_fail_percentage 12033 1726867163.15317: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.15318: done checking to see if all hosts have failed 12033 1726867163.15318: getting the remaining hosts for this loop 12033 1726867163.15320: done getting the remaining hosts for this loop 12033 1726867163.15323: getting the next task for host managed_node3 12033 1726867163.15328: done getting next task for host managed_node3 12033 1726867163.15330: ^ task is: TASK: Set flag to indicate system is ostree 12033 1726867163.15332: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.15336: getting variables 12033 1726867163.15337: in VariableManager get_vars() 12033 1726867163.15367: Calling all_inventory to load vars for managed_node3 12033 1726867163.15369: Calling groups_inventory to load vars for managed_node3 12033 1726867163.15372: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.15383: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.15386: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.15391: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.15542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.15659: done with get_vars() 12033 1726867163.15667: done getting variables 12033 1726867163.15740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:19:23 -0400 (0:00:00.439) 0:00:02.273 ****** 12033 1726867163.15761: entering _queue_task() for managed_node3/set_fact 12033 1726867163.15762: Creating lock for set_fact 12033 1726867163.15958: worker is 1 (out of 1 available) 12033 1726867163.15971: exiting _queue_task() for managed_node3/set_fact 12033 1726867163.15983: done queuing things up, now waiting for results queue to drain 12033 1726867163.15985: waiting for pending results... 
12033 1726867163.16124: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 12033 1726867163.16176: in run() - task 0affcac9-a3a5-74bb-502b-000000000029 12033 1726867163.16188: variable 'ansible_search_path' from source: unknown 12033 1726867163.16191: variable 'ansible_search_path' from source: unknown 12033 1726867163.16224: calling self._execute() 12033 1726867163.16276: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.16282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.16291: variable 'omit' from source: magic vars 12033 1726867163.16620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867163.16834: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867163.16866: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867163.16890: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867163.16916: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867163.16974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867163.16997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867163.17015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867163.17032: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867163.17118: Evaluated conditional (not __network_is_ostree is defined): True 12033 1726867163.17122: variable 'omit' from source: magic vars 12033 1726867163.17145: variable 'omit' from source: magic vars 12033 1726867163.17227: variable '__ostree_booted_stat' from source: set_fact 12033 1726867163.17264: variable 'omit' from source: magic vars 12033 1726867163.17283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867163.17312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867163.17322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867163.17335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.17344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.17365: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867163.17368: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.17370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.17441: Set connection var ansible_pipelining to False 12033 1726867163.17444: Set connection var ansible_shell_executable to /bin/sh 12033 1726867163.17452: Set connection var ansible_timeout to 10 12033 1726867163.17457: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867163.17459: Set connection var ansible_connection to ssh 12033 1726867163.17464: Set connection var ansible_shell_type to sh 12033 1726867163.17489: variable 'ansible_shell_executable' 
from source: unknown 12033 1726867163.17495: variable 'ansible_connection' from source: unknown 12033 1726867163.17498: variable 'ansible_module_compression' from source: unknown 12033 1726867163.17500: variable 'ansible_shell_type' from source: unknown 12033 1726867163.17502: variable 'ansible_shell_executable' from source: unknown 12033 1726867163.17504: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.17509: variable 'ansible_pipelining' from source: unknown 12033 1726867163.17513: variable 'ansible_timeout' from source: unknown 12033 1726867163.17515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.17581: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867163.17589: variable 'omit' from source: magic vars 12033 1726867163.17597: starting attempt loop 12033 1726867163.17599: running the handler 12033 1726867163.17607: handler run complete 12033 1726867163.17614: attempt loop complete, returning result 12033 1726867163.17617: _execute() done 12033 1726867163.17619: dumping result to json 12033 1726867163.17621: done dumping result, returning 12033 1726867163.17628: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-74bb-502b-000000000029] 12033 1726867163.17637: sending task result for task 0affcac9-a3a5-74bb-502b-000000000029 12033 1726867163.17706: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000029 12033 1726867163.17708: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 12033 1726867163.17774: no more pending results, returning what we have 12033 1726867163.17779: results 
queue empty 12033 1726867163.17780: checking for any_errors_fatal 12033 1726867163.17784: done checking for any_errors_fatal 12033 1726867163.17785: checking for max_fail_percentage 12033 1726867163.17786: done checking for max_fail_percentage 12033 1726867163.17787: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.17788: done checking to see if all hosts have failed 12033 1726867163.17788: getting the remaining hosts for this loop 12033 1726867163.17790: done getting the remaining hosts for this loop 12033 1726867163.17792: getting the next task for host managed_node3 12033 1726867163.17798: done getting next task for host managed_node3 12033 1726867163.17800: ^ task is: TASK: Fix CentOS6 Base repo 12033 1726867163.17802: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.17805: getting variables 12033 1726867163.17806: in VariableManager get_vars() 12033 1726867163.17828: Calling all_inventory to load vars for managed_node3 12033 1726867163.17830: Calling groups_inventory to load vars for managed_node3 12033 1726867163.17833: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.17841: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.17843: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.17850: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.17975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.18089: done with get_vars() 12033 1726867163.18095: done getting variables 12033 1726867163.18169: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:19:23 -0400 (0:00:00.024) 0:00:02.297 ****** 12033 1726867163.18192: entering _queue_task() for managed_node3/copy 12033 1726867163.18357: worker is 1 (out of 1 available) 12033 1726867163.18369: exiting _queue_task() for managed_node3/copy 12033 1726867163.18382: done queuing things up, now waiting for results queue to drain 12033 1726867163.18383: waiting for pending results... 
12033 1726867163.18512: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 12033 1726867163.18562: in run() - task 0affcac9-a3a5-74bb-502b-00000000002b 12033 1726867163.18572: variable 'ansible_search_path' from source: unknown 12033 1726867163.18575: variable 'ansible_search_path' from source: unknown 12033 1726867163.18609: calling self._execute() 12033 1726867163.18653: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.18658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.18666: variable 'omit' from source: magic vars 12033 1726867163.18980: variable 'ansible_distribution' from source: facts 12033 1726867163.18998: Evaluated conditional (ansible_distribution == 'CentOS'): True 12033 1726867163.19075: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.19083: Evaluated conditional (ansible_distribution_major_version == '6'): False 12033 1726867163.19086: when evaluation is False, skipping this task 12033 1726867163.19088: _execute() done 12033 1726867163.19094: dumping result to json 12033 1726867163.19096: done dumping result, returning 12033 1726867163.19103: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-74bb-502b-00000000002b] 12033 1726867163.19105: sending task result for task 0affcac9-a3a5-74bb-502b-00000000002b 12033 1726867163.19197: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000002b 12033 1726867163.19199: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12033 1726867163.19252: no more pending results, returning what we have 12033 1726867163.19255: results queue empty 12033 1726867163.19255: checking for any_errors_fatal 12033 1726867163.19258: done checking for any_errors_fatal 12033 1726867163.19259: checking for 
max_fail_percentage 12033 1726867163.19260: done checking for max_fail_percentage 12033 1726867163.19261: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.19261: done checking to see if all hosts have failed 12033 1726867163.19262: getting the remaining hosts for this loop 12033 1726867163.19263: done getting the remaining hosts for this loop 12033 1726867163.19266: getting the next task for host managed_node3 12033 1726867163.19270: done getting next task for host managed_node3 12033 1726867163.19272: ^ task is: TASK: Include the task 'enable_epel.yml' 12033 1726867163.19275: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.19279: getting variables 12033 1726867163.19281: in VariableManager get_vars() 12033 1726867163.19302: Calling all_inventory to load vars for managed_node3 12033 1726867163.19304: Calling groups_inventory to load vars for managed_node3 12033 1726867163.19307: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.19314: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.19316: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.19318: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.19415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.19524: done with get_vars() 12033 1726867163.19532: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:19:23 -0400 (0:00:00.013) 0:00:02.311 ****** 12033 1726867163.19589: entering _queue_task() for managed_node3/include_tasks 12033 1726867163.19750: worker is 1 (out of 1 available) 12033 1726867163.19763: exiting _queue_task() for managed_node3/include_tasks 12033 1726867163.19773: done queuing things up, now waiting for results queue to drain 12033 1726867163.19774: waiting for pending results... 
12033 1726867163.19899: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 12033 1726867163.19950: in run() - task 0affcac9-a3a5-74bb-502b-00000000002c 12033 1726867163.19960: variable 'ansible_search_path' from source: unknown 12033 1726867163.19964: variable 'ansible_search_path' from source: unknown 12033 1726867163.19995: calling self._execute() 12033 1726867163.20044: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.20048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.20055: variable 'omit' from source: magic vars 12033 1726867163.20409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867163.21894: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867163.21934: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867163.21959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867163.21999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867163.22018: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867163.22071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867163.22095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867163.22113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867163.22139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867163.22149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867163.22227: variable '__network_is_ostree' from source: set_fact 12033 1726867163.22239: Evaluated conditional (not __network_is_ostree | d(false)): True 12033 1726867163.22243: _execute() done 12033 1726867163.22245: dumping result to json 12033 1726867163.22248: done dumping result, returning 12033 1726867163.22254: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-74bb-502b-00000000002c] 12033 1726867163.22257: sending task result for task 0affcac9-a3a5-74bb-502b-00000000002c 12033 1726867163.22336: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000002c 12033 1726867163.22339: WORKER PROCESS EXITING 12033 1726867163.22361: no more pending results, returning what we have 12033 1726867163.22365: in VariableManager get_vars() 12033 1726867163.22398: Calling all_inventory to load vars for managed_node3 12033 1726867163.22401: Calling groups_inventory to load vars for managed_node3 12033 1726867163.22404: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.22412: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.22414: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.22417: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.22565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 12033 1726867163.22680: done with get_vars() 12033 1726867163.22685: variable 'ansible_search_path' from source: unknown 12033 1726867163.22686: variable 'ansible_search_path' from source: unknown 12033 1726867163.22713: we have included files to process 12033 1726867163.22714: generating all_blocks data 12033 1726867163.22715: done generating all_blocks data 12033 1726867163.22718: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12033 1726867163.22719: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12033 1726867163.22720: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12033 1726867163.23166: done processing included file 12033 1726867163.23168: iterating over new_blocks loaded from include file 12033 1726867163.23169: in VariableManager get_vars() 12033 1726867163.23176: done with get_vars() 12033 1726867163.23179: filtering new block on tags 12033 1726867163.23194: done filtering new block on tags 12033 1726867163.23195: in VariableManager get_vars() 12033 1726867163.23202: done with get_vars() 12033 1726867163.23203: filtering new block on tags 12033 1726867163.23209: done filtering new block on tags 12033 1726867163.23210: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 12033 1726867163.23214: extending task lists for all hosts with included blocks 12033 1726867163.23275: done extending task lists 12033 1726867163.23276: done processing included files 12033 1726867163.23278: results queue empty 12033 1726867163.23279: checking for any_errors_fatal 12033 1726867163.23280: done checking for any_errors_fatal 12033 1726867163.23281: checking for max_fail_percentage 12033 1726867163.23281: done 
checking for max_fail_percentage 12033 1726867163.23282: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.23282: done checking to see if all hosts have failed 12033 1726867163.23283: getting the remaining hosts for this loop 12033 1726867163.23284: done getting the remaining hosts for this loop 12033 1726867163.23285: getting the next task for host managed_node3 12033 1726867163.23289: done getting next task for host managed_node3 12033 1726867163.23290: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 12033 1726867163.23292: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.23293: getting variables 12033 1726867163.23294: in VariableManager get_vars() 12033 1726867163.23299: Calling all_inventory to load vars for managed_node3 12033 1726867163.23301: Calling groups_inventory to load vars for managed_node3 12033 1726867163.23302: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.23305: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.23310: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.23312: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.23406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.23516: done with get_vars() 12033 1726867163.23522: done getting variables 12033 1726867163.23563: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12033 1726867163.23690: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:19:23 -0400 (0:00:00.041) 0:00:02.352 ****** 12033 1726867163.23718: entering _queue_task() for managed_node3/command 12033 1726867163.23719: Creating lock for command 12033 1726867163.23886: worker is 1 (out of 1 available) 12033 1726867163.23901: exiting _queue_task() for managed_node3/command 12033 1726867163.23911: done queuing things up, now waiting for results queue to drain 12033 1726867163.23913: waiting for pending results... 
12033 1726867163.24034: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 12033 1726867163.24095: in run() - task 0affcac9-a3a5-74bb-502b-000000000046 12033 1726867163.24102: variable 'ansible_search_path' from source: unknown 12033 1726867163.24105: variable 'ansible_search_path' from source: unknown 12033 1726867163.24130: calling self._execute() 12033 1726867163.24179: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.24184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.24192: variable 'omit' from source: magic vars 12033 1726867163.24424: variable 'ansible_distribution' from source: facts 12033 1726867163.24432: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12033 1726867163.24517: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.24521: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12033 1726867163.24524: when evaluation is False, skipping this task 12033 1726867163.24526: _execute() done 12033 1726867163.24529: dumping result to json 12033 1726867163.24531: done dumping result, returning 12033 1726867163.24538: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0affcac9-a3a5-74bb-502b-000000000046] 12033 1726867163.24542: sending task result for task 0affcac9-a3a5-74bb-502b-000000000046 12033 1726867163.24629: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000046 12033 1726867163.24632: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12033 1726867163.24674: no more pending results, returning what we have 12033 1726867163.24676: results queue empty 12033 1726867163.24679: checking for any_errors_fatal 12033 1726867163.24680: done checking for any_errors_fatal 12033 1726867163.24681: checking for 
max_fail_percentage 12033 1726867163.24682: done checking for max_fail_percentage 12033 1726867163.24683: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.24683: done checking to see if all hosts have failed 12033 1726867163.24684: getting the remaining hosts for this loop 12033 1726867163.24685: done getting the remaining hosts for this loop 12033 1726867163.24690: getting the next task for host managed_node3 12033 1726867163.24695: done getting next task for host managed_node3 12033 1726867163.24697: ^ task is: TASK: Install yum-utils package 12033 1726867163.24700: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.24702: getting variables 12033 1726867163.24703: in VariableManager get_vars() 12033 1726867163.24724: Calling all_inventory to load vars for managed_node3 12033 1726867163.24727: Calling groups_inventory to load vars for managed_node3 12033 1726867163.24729: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.24736: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.24737: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.24739: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.24941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.25049: done with get_vars() 12033 1726867163.25056: done getting variables 12033 1726867163.25119: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:19:23 -0400 (0:00:00.014) 0:00:02.367 ****** 12033 1726867163.25136: entering _queue_task() for managed_node3/package 12033 1726867163.25137: Creating lock for package 12033 1726867163.25301: worker is 1 (out of 1 available) 12033 1726867163.25314: exiting _queue_task() for managed_node3/package 12033 1726867163.25325: done queuing things up, now waiting for results queue to drain 12033 1726867163.25326: waiting for pending results... 
12033 1726867163.25446: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 12033 1726867163.25508: in run() - task 0affcac9-a3a5-74bb-502b-000000000047 12033 1726867163.25518: variable 'ansible_search_path' from source: unknown 12033 1726867163.25521: variable 'ansible_search_path' from source: unknown 12033 1726867163.25545: calling self._execute() 12033 1726867163.25597: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.25602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.25610: variable 'omit' from source: magic vars 12033 1726867163.25845: variable 'ansible_distribution' from source: facts 12033 1726867163.25853: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12033 1726867163.25938: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.25941: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12033 1726867163.25944: when evaluation is False, skipping this task 12033 1726867163.25947: _execute() done 12033 1726867163.25949: dumping result to json 12033 1726867163.25953: done dumping result, returning 12033 1726867163.25959: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affcac9-a3a5-74bb-502b-000000000047] 12033 1726867163.25964: sending task result for task 0affcac9-a3a5-74bb-502b-000000000047 12033 1726867163.26045: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000047 12033 1726867163.26048: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12033 1726867163.26097: no more pending results, returning what we have 12033 1726867163.26100: results queue empty 12033 1726867163.26100: checking for any_errors_fatal 12033 1726867163.26106: done checking for any_errors_fatal 12033 
1726867163.26106: checking for max_fail_percentage 12033 1726867163.26108: done checking for max_fail_percentage 12033 1726867163.26108: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.26109: done checking to see if all hosts have failed 12033 1726867163.26110: getting the remaining hosts for this loop 12033 1726867163.26111: done getting the remaining hosts for this loop 12033 1726867163.26114: getting the next task for host managed_node3 12033 1726867163.26118: done getting next task for host managed_node3 12033 1726867163.26120: ^ task is: TASK: Enable EPEL 7 12033 1726867163.26123: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.26126: getting variables 12033 1726867163.26127: in VariableManager get_vars() 12033 1726867163.26148: Calling all_inventory to load vars for managed_node3 12033 1726867163.26150: Calling groups_inventory to load vars for managed_node3 12033 1726867163.26153: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.26162: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.26165: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.26167: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.26265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.26380: done with get_vars() 12033 1726867163.26389: done getting variables 12033 1726867163.26424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:19:23 -0400 (0:00:00.013) 0:00:02.380 ****** 12033 1726867163.26443: entering _queue_task() for managed_node3/command 12033 1726867163.26607: worker is 1 (out of 1 available) 12033 1726867163.26619: exiting _queue_task() for managed_node3/command 12033 1726867163.26630: done queuing things up, now waiting for results queue to drain 12033 1726867163.26631: waiting for pending results... 
12033 1726867163.26765: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 12033 1726867163.26820: in run() - task 0affcac9-a3a5-74bb-502b-000000000048 12033 1726867163.26831: variable 'ansible_search_path' from source: unknown 12033 1726867163.26835: variable 'ansible_search_path' from source: unknown 12033 1726867163.26861: calling self._execute() 12033 1726867163.26911: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.26915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.26923: variable 'omit' from source: magic vars 12033 1726867163.27165: variable 'ansible_distribution' from source: facts 12033 1726867163.27174: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12033 1726867163.27259: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.27263: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12033 1726867163.27266: when evaluation is False, skipping this task 12033 1726867163.27269: _execute() done 12033 1726867163.27272: dumping result to json 12033 1726867163.27274: done dumping result, returning 12033 1726867163.27282: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affcac9-a3a5-74bb-502b-000000000048] 12033 1726867163.27300: sending task result for task 0affcac9-a3a5-74bb-502b-000000000048 12033 1726867163.27363: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000048 12033 1726867163.27365: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12033 1726867163.27435: no more pending results, returning what we have 12033 1726867163.27438: results queue empty 12033 1726867163.27439: checking for any_errors_fatal 12033 1726867163.27443: done checking for any_errors_fatal 12033 1726867163.27444: checking for 
max_fail_percentage 12033 1726867163.27445: done checking for max_fail_percentage 12033 1726867163.27445: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.27446: done checking to see if all hosts have failed 12033 1726867163.27447: getting the remaining hosts for this loop 12033 1726867163.27448: done getting the remaining hosts for this loop 12033 1726867163.27451: getting the next task for host managed_node3 12033 1726867163.27455: done getting next task for host managed_node3 12033 1726867163.27458: ^ task is: TASK: Enable EPEL 8 12033 1726867163.27461: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.27464: getting variables 12033 1726867163.27465: in VariableManager get_vars() 12033 1726867163.27485: Calling all_inventory to load vars for managed_node3 12033 1726867163.27486: Calling groups_inventory to load vars for managed_node3 12033 1726867163.27490: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.27496: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.27498: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.27500: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.27634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.27745: done with get_vars() 12033 1726867163.27751: done getting variables 12033 1726867163.27789: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:19:23 -0400 (0:00:00.013) 0:00:02.393 ****** 12033 1726867163.27807: entering _queue_task() for managed_node3/command 12033 1726867163.27973: worker is 1 (out of 1 available) 12033 1726867163.27986: exiting _queue_task() for managed_node3/command 12033 1726867163.27999: done queuing things up, now waiting for results queue to drain 12033 1726867163.28001: waiting for pending results... 
12033 1726867163.28123: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 12033 1726867163.28178: in run() - task 0affcac9-a3a5-74bb-502b-000000000049 12033 1726867163.28191: variable 'ansible_search_path' from source: unknown 12033 1726867163.28194: variable 'ansible_search_path' from source: unknown 12033 1726867163.28218: calling self._execute() 12033 1726867163.28267: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.28271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.28280: variable 'omit' from source: magic vars 12033 1726867163.28523: variable 'ansible_distribution' from source: facts 12033 1726867163.28532: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12033 1726867163.28615: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.28619: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12033 1726867163.28622: when evaluation is False, skipping this task 12033 1726867163.28624: _execute() done 12033 1726867163.28628: dumping result to json 12033 1726867163.28631: done dumping result, returning 12033 1726867163.28636: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affcac9-a3a5-74bb-502b-000000000049] 12033 1726867163.28641: sending task result for task 0affcac9-a3a5-74bb-502b-000000000049 12033 1726867163.28718: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000049 12033 1726867163.28721: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12033 1726867163.28761: no more pending results, returning what we have 12033 1726867163.28763: results queue empty 12033 1726867163.28764: checking for any_errors_fatal 12033 1726867163.28768: done checking for any_errors_fatal 12033 1726867163.28769: checking for 
max_fail_percentage 12033 1726867163.28770: done checking for max_fail_percentage 12033 1726867163.28771: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.28771: done checking to see if all hosts have failed 12033 1726867163.28772: getting the remaining hosts for this loop 12033 1726867163.28773: done getting the remaining hosts for this loop 12033 1726867163.28776: getting the next task for host managed_node3 12033 1726867163.28784: done getting next task for host managed_node3 12033 1726867163.28786: ^ task is: TASK: Enable EPEL 6 12033 1726867163.28791: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.28794: getting variables 12033 1726867163.28795: in VariableManager get_vars() 12033 1726867163.28816: Calling all_inventory to load vars for managed_node3 12033 1726867163.28818: Calling groups_inventory to load vars for managed_node3 12033 1726867163.28821: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.28828: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.28830: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.28832: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.28933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.29043: done with get_vars() 12033 1726867163.29049: done getting variables 12033 1726867163.29086: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:19:23 -0400 (0:00:00.012) 0:00:02.406 ****** 12033 1726867163.29105: entering _queue_task() for managed_node3/copy 12033 1726867163.29260: worker is 1 (out of 1 available) 12033 1726867163.29273: exiting _queue_task() for managed_node3/copy 12033 1726867163.29283: done queuing things up, now waiting for results queue to drain 12033 1726867163.29285: waiting for pending results... 
12033 1726867163.29412: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 12033 1726867163.29461: in run() - task 0affcac9-a3a5-74bb-502b-00000000004b 12033 1726867163.29471: variable 'ansible_search_path' from source: unknown 12033 1726867163.29474: variable 'ansible_search_path' from source: unknown 12033 1726867163.29507: calling self._execute() 12033 1726867163.29549: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.29553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.29561: variable 'omit' from source: magic vars 12033 1726867163.29799: variable 'ansible_distribution' from source: facts 12033 1726867163.29807: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12033 1726867163.29882: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.29890: Evaluated conditional (ansible_distribution_major_version == '6'): False 12033 1726867163.29893: when evaluation is False, skipping this task 12033 1726867163.29896: _execute() done 12033 1726867163.29899: dumping result to json 12033 1726867163.29901: done dumping result, returning 12033 1726867163.29904: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affcac9-a3a5-74bb-502b-00000000004b] 12033 1726867163.29909: sending task result for task 0affcac9-a3a5-74bb-502b-00000000004b 12033 1726867163.29992: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000004b 12033 1726867163.29995: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12033 1726867163.30038: no more pending results, returning what we have 12033 1726867163.30041: results queue empty 12033 1726867163.30042: checking for any_errors_fatal 12033 1726867163.30046: done checking for any_errors_fatal 12033 1726867163.30046: checking for max_fail_percentage 
12033 1726867163.30048: done checking for max_fail_percentage 12033 1726867163.30048: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.30049: done checking to see if all hosts have failed 12033 1726867163.30050: getting the remaining hosts for this loop 12033 1726867163.30051: done getting the remaining hosts for this loop 12033 1726867163.30053: getting the next task for host managed_node3 12033 1726867163.30059: done getting next task for host managed_node3 12033 1726867163.30061: ^ task is: TASK: Set network provider to 'nm' 12033 1726867163.30063: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867163.30067: getting variables 12033 1726867163.30068: in VariableManager get_vars() 12033 1726867163.30092: Calling all_inventory to load vars for managed_node3 12033 1726867163.30094: Calling groups_inventory to load vars for managed_node3 12033 1726867163.30095: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.30101: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.30103: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.30110: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.30233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.30341: done with get_vars() 12033 1726867163.30347: done getting variables 12033 1726867163.30383: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Friday 20 September 2024 17:19:23 -0400 (0:00:00.012) 0:00:02.419 ****** 12033 1726867163.30401: entering _queue_task() for managed_node3/set_fact 12033 1726867163.30556: worker is 1 (out of 1 available) 12033 1726867163.30567: exiting _queue_task() for managed_node3/set_fact 12033 1726867163.30580: done queuing things up, now waiting for results queue to drain 12033 1726867163.30581: waiting for pending results... 12033 1726867163.30702: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 12033 1726867163.30743: in run() - task 0affcac9-a3a5-74bb-502b-000000000007 12033 1726867163.30754: variable 'ansible_search_path' from source: unknown 12033 1726867163.30780: calling self._execute() 12033 1726867163.30834: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.30839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.30846: variable 'omit' from source: magic vars 12033 1726867163.30917: variable 'omit' from source: magic vars 12033 1726867163.30938: variable 'omit' from source: magic vars 12033 1726867163.30961: variable 'omit' from source: magic vars 12033 1726867163.30993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867163.31020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867163.31034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867163.31047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.31056: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.31080: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867163.31084: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.31086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.31155: Set connection var ansible_pipelining to False 12033 1726867163.31161: Set connection var ansible_shell_executable to /bin/sh 12033 1726867163.31168: Set connection var ansible_timeout to 10 12033 1726867163.31173: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867163.31175: Set connection var ansible_connection to ssh 12033 1726867163.31182: Set connection var ansible_shell_type to sh 12033 1726867163.31199: variable 'ansible_shell_executable' from source: unknown 12033 1726867163.31201: variable 'ansible_connection' from source: unknown 12033 1726867163.31204: variable 'ansible_module_compression' from source: unknown 12033 1726867163.31206: variable 'ansible_shell_type' from source: unknown 12033 1726867163.31208: variable 'ansible_shell_executable' from source: unknown 12033 1726867163.31212: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.31216: variable 'ansible_pipelining' from source: unknown 12033 1726867163.31218: variable 'ansible_timeout' from source: unknown 12033 1726867163.31222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.31316: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867163.31325: variable 'omit' from source: magic vars 12033 1726867163.31329: starting 
attempt loop 12033 1726867163.31332: running the handler 12033 1726867163.31342: handler run complete 12033 1726867163.31353: attempt loop complete, returning result 12033 1726867163.31356: _execute() done 12033 1726867163.31358: dumping result to json 12033 1726867163.31361: done dumping result, returning 12033 1726867163.31363: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affcac9-a3a5-74bb-502b-000000000007] 12033 1726867163.31365: sending task result for task 0affcac9-a3a5-74bb-502b-000000000007 12033 1726867163.31438: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000007 12033 1726867163.31440: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 12033 1726867163.31513: no more pending results, returning what we have 12033 1726867163.31515: results queue empty 12033 1726867163.31516: checking for any_errors_fatal 12033 1726867163.31521: done checking for any_errors_fatal 12033 1726867163.31522: checking for max_fail_percentage 12033 1726867163.31523: done checking for max_fail_percentage 12033 1726867163.31524: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.31524: done checking to see if all hosts have failed 12033 1726867163.31525: getting the remaining hosts for this loop 12033 1726867163.31526: done getting the remaining hosts for this loop 12033 1726867163.31528: getting the next task for host managed_node3 12033 1726867163.31532: done getting next task for host managed_node3 12033 1726867163.31533: ^ task is: TASK: meta (flush_handlers) 12033 1726867163.31535: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.31539: getting variables 12033 1726867163.31540: in VariableManager get_vars() 12033 1726867163.31559: Calling all_inventory to load vars for managed_node3 12033 1726867163.31560: Calling groups_inventory to load vars for managed_node3 12033 1726867163.31564: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.31572: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.31573: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.31575: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.31672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.31780: done with get_vars() 12033 1726867163.31790: done getting variables 12033 1726867163.31831: in VariableManager get_vars() 12033 1726867163.31836: Calling all_inventory to load vars for managed_node3 12033 1726867163.31838: Calling groups_inventory to load vars for managed_node3 12033 1726867163.31839: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.31842: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.31843: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.31845: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.31944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.32054: done with get_vars() 12033 1726867163.32063: done queuing things up, now waiting for results queue to drain 12033 1726867163.32064: results queue empty 12033 1726867163.32065: checking for any_errors_fatal 12033 1726867163.32066: done checking for any_errors_fatal 12033 1726867163.32066: checking for max_fail_percentage 12033 1726867163.32067: done checking for max_fail_percentage 12033 1726867163.32067: checking to see if all hosts have failed and the running result is not 
ok 12033 1726867163.32068: done checking to see if all hosts have failed 12033 1726867163.32068: getting the remaining hosts for this loop 12033 1726867163.32069: done getting the remaining hosts for this loop 12033 1726867163.32070: getting the next task for host managed_node3 12033 1726867163.32072: done getting next task for host managed_node3 12033 1726867163.32073: ^ task is: TASK: meta (flush_handlers) 12033 1726867163.32074: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867163.32081: getting variables 12033 1726867163.32081: in VariableManager get_vars() 12033 1726867163.32086: Calling all_inventory to load vars for managed_node3 12033 1726867163.32089: Calling groups_inventory to load vars for managed_node3 12033 1726867163.32091: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.32094: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.32095: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.32097: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.32174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.32279: done with get_vars() 12033 1726867163.32285: done getting variables 12033 1726867163.32314: in VariableManager get_vars() 12033 1726867163.32319: Calling all_inventory to load vars for managed_node3 12033 1726867163.32320: Calling groups_inventory to load vars for managed_node3 12033 1726867163.32322: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.32326: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.32328: Calling groups_plugins_inventory to load vars for 
managed_node3 12033 1726867163.32329: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.32408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.32527: done with get_vars() 12033 1726867163.32535: done queuing things up, now waiting for results queue to drain 12033 1726867163.32536: results queue empty 12033 1726867163.32537: checking for any_errors_fatal 12033 1726867163.32537: done checking for any_errors_fatal 12033 1726867163.32538: checking for max_fail_percentage 12033 1726867163.32538: done checking for max_fail_percentage 12033 1726867163.32539: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.32540: done checking to see if all hosts have failed 12033 1726867163.32540: getting the remaining hosts for this loop 12033 1726867163.32541: done getting the remaining hosts for this loop 12033 1726867163.32543: getting the next task for host managed_node3 12033 1726867163.32545: done getting next task for host managed_node3 12033 1726867163.32546: ^ task is: None 12033 1726867163.32547: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.32548: done queuing things up, now waiting for results queue to drain 12033 1726867163.32548: results queue empty 12033 1726867163.32548: checking for any_errors_fatal 12033 1726867163.32549: done checking for any_errors_fatal 12033 1726867163.32549: checking for max_fail_percentage 12033 1726867163.32550: done checking for max_fail_percentage 12033 1726867163.32550: checking to see if all hosts have failed and the running result is not ok 12033 1726867163.32551: done checking to see if all hosts have failed 12033 1726867163.32552: getting the next task for host managed_node3 12033 1726867163.32553: done getting next task for host managed_node3 12033 1726867163.32553: ^ task is: None 12033 1726867163.32554: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.32591: in VariableManager get_vars() 12033 1726867163.32602: done with get_vars() 12033 1726867163.32605: in VariableManager get_vars() 12033 1726867163.32611: done with get_vars() 12033 1726867163.32613: variable 'omit' from source: magic vars 12033 1726867163.32631: in VariableManager get_vars() 12033 1726867163.32637: done with get_vars() 12033 1726867163.32648: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 12033 1726867163.32795: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12033 1726867163.32814: getting the remaining hosts for this loop 12033 1726867163.32815: done getting the remaining hosts for this loop 12033 1726867163.32817: getting the next task for host managed_node3 12033 1726867163.32819: done getting next task for host managed_node3 12033 1726867163.32820: ^ task is: TASK: Gathering Facts 12033 1726867163.32821: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867163.32822: getting variables 12033 1726867163.32823: in VariableManager get_vars() 12033 1726867163.32828: Calling all_inventory to load vars for managed_node3 12033 1726867163.32829: Calling groups_inventory to load vars for managed_node3 12033 1726867163.32830: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867163.32834: Calling all_plugins_play to load vars for managed_node3 12033 1726867163.32842: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867163.32844: Calling groups_plugins_play to load vars for managed_node3 12033 1726867163.32927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867163.33030: done with get_vars() 12033 1726867163.33036: done getting variables 12033 1726867163.33059: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Friday 20 September 2024 17:19:23 -0400 (0:00:00.026) 0:00:02.446 ****** 12033 1726867163.33072: entering _queue_task() for managed_node3/gather_facts 12033 1726867163.33221: worker is 1 (out of 1 available) 12033 1726867163.33232: exiting _queue_task() for managed_node3/gather_facts 12033 1726867163.33246: done queuing things up, now waiting for results queue to drain 12033 1726867163.33247: waiting for pending results... 
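The pattern visible in the log at this point — `_queue_task()` hands the (host, task) pair to a worker, then the strategy blocks "waiting for pending results" until the results queue drains — can be sketched in very simplified form. This is not Ansible's actual `WorkerProcess`/`TaskExecutor` code (which uses multiprocessing and far richer result objects); it is a minimal model of the same producer/worker/results-queue shape, with all names below chosen for illustration:

```python
import queue
import threading

def worker(task_queue, results_queue):
    """Simplified stand-in for an Ansible worker: pull (host, task),
    'execute' it, and push a result dict back to the strategy."""
    while True:
        item = task_queue.get()
        if item is None:          # sentinel: no more tasks queued
            break
        host, task = item
        # A real worker would run TaskExecutor(); here we just echo a result.
        results_queue.put({"host": host, "task": task, "rc": 0})
        task_queue.task_done()

task_q, result_q = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(task_q, result_q))
t.start()

# Roughly what _queue_task() does for managed_node3 / "Gathering Facts":
task_q.put(("managed_node3", "Gathering Facts"))
task_q.put(None)
t.join()                          # "waiting for results queue to drain"

result = result_q.get()
print(result)
```

The key design point the log illustrates is that the linear strategy never runs module code in-process: it only enqueues work and consumes results, which is why the log interleaves "exiting _queue_task()" with "waiting for pending results...".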
12033 1726867163.33358: running TaskExecutor() for managed_node3/TASK: Gathering Facts 12033 1726867163.33404: in run() - task 0affcac9-a3a5-74bb-502b-000000000071 12033 1726867163.33414: variable 'ansible_search_path' from source: unknown 12033 1726867163.33439: calling self._execute() 12033 1726867163.33485: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.33492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.33498: variable 'omit' from source: magic vars 12033 1726867163.33779: variable 'ansible_distribution_major_version' from source: facts 12033 1726867163.33790: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867163.33793: variable 'omit' from source: magic vars 12033 1726867163.33813: variable 'omit' from source: magic vars 12033 1726867163.33835: variable 'omit' from source: magic vars 12033 1726867163.33861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867163.33886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867163.33900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867163.33918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.33925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867163.33945: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867163.33948: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.33951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.34012: Set connection var ansible_pipelining to False 12033 1726867163.34026: Set 
connection var ansible_shell_executable to /bin/sh 12033 1726867163.34029: Set connection var ansible_timeout to 10 12033 1726867163.34032: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867163.34035: Set connection var ansible_connection to ssh 12033 1726867163.34040: Set connection var ansible_shell_type to sh 12033 1726867163.34055: variable 'ansible_shell_executable' from source: unknown 12033 1726867163.34058: variable 'ansible_connection' from source: unknown 12033 1726867163.34060: variable 'ansible_module_compression' from source: unknown 12033 1726867163.34063: variable 'ansible_shell_type' from source: unknown 12033 1726867163.34065: variable 'ansible_shell_executable' from source: unknown 12033 1726867163.34067: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867163.34070: variable 'ansible_pipelining' from source: unknown 12033 1726867163.34072: variable 'ansible_timeout' from source: unknown 12033 1726867163.34076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867163.34196: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867163.34204: variable 'omit' from source: magic vars 12033 1726867163.34208: starting attempt loop 12033 1726867163.34211: running the handler 12033 1726867163.34222: variable 'ansible_facts' from source: unknown 12033 1726867163.34244: _low_level_execute_command(): starting 12033 1726867163.34247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867163.34744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867163.34749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867163.34752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867163.34754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.34801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867163.34804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867163.34864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867163.36551: stdout chunk (state=3): >>>/root <<< 12033 1726867163.36650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867163.36681: stderr chunk (state=3): >>><<< 12033 1726867163.36684: stdout chunk (state=3): >>><<< 12033 1726867163.36703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867163.36714: _low_level_execute_command(): starting 12033 1726867163.36718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921 `" && echo ansible-tmp-1726867163.3670273-12163-265817256506921="` echo /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921 `" ) && sleep 0' 12033 1726867163.37154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867163.37157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867163.37159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867163.37168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867163.37172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.37220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867163.37227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867163.37270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867163.39146: stdout chunk (state=3): >>>ansible-tmp-1726867163.3670273-12163-265817256506921=/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921 <<< 12033 1726867163.39383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867163.39387: stdout chunk (state=3): >>><<< 12033 1726867163.39389: stderr chunk (state=3): >>><<< 12033 1726867163.39392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867163.3670273-12163-265817256506921=/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867163.39395: variable 'ansible_module_compression' from source: unknown 12033 1726867163.39421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12033 1726867163.39483: variable 'ansible_facts' from source: unknown 12033 1726867163.39710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py 12033 1726867163.39858: Sending initial data 12033 1726867163.39868: Sent initial data (154 bytes) 12033 1726867163.40584: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867163.40605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867163.40685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867163.40721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867163.40794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867163.42309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867163.42313: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867163.42354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867163.42402: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjqgkoq7l /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py <<< 12033 1726867163.42405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py" <<< 12033 1726867163.42450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjqgkoq7l" to remote "/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py" <<< 12033 1726867163.43481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867163.43513: stderr chunk (state=3): >>><<< 12033 1726867163.43516: stdout chunk (state=3): >>><<< 12033 1726867163.43532: done transferring module to remote 12033 1726867163.43543: _low_level_execute_command(): starting 12033 1726867163.43546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/ /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py && sleep 0' 12033 1726867163.43935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867163.43939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.43950: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.44010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867163.44017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867163.44019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867163.44059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867163.45790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867163.45811: stderr chunk (state=3): >>><<< 12033 1726867163.45814: stdout chunk (state=3): >>><<< 12033 1726867163.45827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867163.45830: _low_level_execute_command(): starting 12033 1726867163.45832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/AnsiballZ_setup.py && sleep 0' 12033 1726867163.46220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867163.46223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.46225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867163.46229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867163.46268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867163.46289: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12033 1726867163.46335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.10337: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": 
"RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_loadavg": {"1m": 0.41943359375, "5m": 0.2890625, "15m": 0.1416015625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "23", "epoch": "1726867163", "epoch_int": "1726867163", "date": "2024-09-20", "time": "17:19:23", "iso8601_micro": "2024-09-20T21:19:23.737630Z", "iso8601": "2024-09-20T21:19:23Z", 
"iso8601_basic": "20240920T171923737630", "iso8601_basic_short": "20240920T171923", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": t<<< 12033 1726867164.10381: stdout chunk (state=3): >>>rue, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2987, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 
0, "ansible_memory_mb": {"real": {"total": 3531, "used": 544, "free": 2987}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 400, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805588480, "block_size": 4096, "block_total": 65519099, "block_available": 63917380, "block_used": 1601719, "inode_total": 131070960, "inode_available": 131029138, "inode_used": 41822, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": fal<<< 12033 1726867164.10392: stdout chunk (state=3): >>>se, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "<<< 12033 1726867164.10408: stdout chunk (state=3): >>>off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", 
"prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12033 1726867164.12276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867164.12305: stderr chunk (state=3): >>><<< 12033 1726867164.12308: stdout chunk (state=3): >>><<< 12033 1726867164.12334: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uUwLUrAgQyz7a8YAgUBvVYqUHIXrD9OD4IdRIvS0wM5DjkrqZknN+lxTMZpMWg/jlXFJVvFXYt0TDIxUv3VMQ7CG9CyYmBbeuJEWvoYwb8DuKGoQjvaw9hPa0a/tdKQRUk5Ee48tJDb1/f7b8HC6W47zMa508De0NKmJpkUCxPSiwETfkKtSFi1NU3yedKOlKSYO4jtNZMDSixlZgDT5la3jcB1k7FimMu61ZL4YdRdqowrsERzsKoyoubw2+euaXWxsKU9sxogT2uxy65PoA58KxP/BEqzQxzR9t9sEvGNVBRBcuBPyFKAEMwdm8wwEuHftGIX6HVD1ZyJ1kV94Sw1QBrBKVYLOc0F2Vfxah2KpheJtfxHN+3Y3VDCJCkheMOUfJL9Uq80f2+8xs3fb05mdaTabyPG6tsrK36M4NCUEwR/rlJ3z1xlUO5AQ7JnNr6OrRQTCXiZXYW8yubiTXlPYBD02/Zw1skEHGR9bVLEEd//GNW0z8DiLO9vRib8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFKa0veb+8P6VFgxqYEtIVaL2y6+Ja4kI5pG6tlGueD6mqrC1AYcokgYEcDSMDOhGEqO5Njf6G9zjcAWiPgmZds=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIE2riHWdppRksv40oyHGkAt2lseuRiuwNlSobn5rl+/f", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-15-68.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-15-68", "ansible_nodename": "ip-10-31-15-68.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24e9df8b51e91cc3587e46253f155b", "ansible_loadavg": {"1m": 0.41943359375, "5m": 0.2890625, "15m": 0.1416015625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "23", "epoch": "1726867163", "epoch_int": "1726867163", "date": "2024-09-20", "time": "17:19:23", "iso8601_micro": "2024-09-20T21:19:23.737630Z", "iso8601": "2024-09-20T21:19:23Z", "iso8601_basic": "20240920T171923737630", "iso8601_basic_short": "20240920T171923", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 49840 10.31.15.68 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.14.65 49840 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2987, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 544, "free": 2987}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_uuid": "ec24e9df-8b51-e91c-c358-7e46253f155b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 400, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805588480, "block_size": 4096, "block_total": 65519099, "block_available": 63917380, "block_used": 1601719, "inode_total": 131070960, "inode_available": 131029138, "inode_used": 41822, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::8ff:deff:fe45:ad8b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.68", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:de:45:ad:8b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.68"], "ansible_all_ipv6_addresses": ["fe80::8ff:deff:fe45:ad8b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.68", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:deff:fe45:ad8b"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867164.12534: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867164.12551: _low_level_execute_command(): starting 12033 1726867164.12556: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867163.3670273-12163-265817256506921/ > /dev/null 2>&1 && sleep 0' 12033 1726867164.12995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867164.12998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867164.13000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.13002: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867164.13004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.13051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867164.13055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.13105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.14912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.14933: stderr chunk (state=3): >>><<< 12033 1726867164.14936: stdout chunk (state=3): >>><<< 12033 1726867164.14948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867164.14958: handler run complete 12033 1726867164.15030: variable 'ansible_facts' from source: unknown 12033 1726867164.15093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.15274: variable 'ansible_facts' from source: unknown 12033 1726867164.15329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.15406: attempt loop complete, returning result 12033 1726867164.15409: _execute() done 12033 1726867164.15411: dumping result to json 12033 1726867164.15430: done dumping result, returning 12033 1726867164.15437: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcac9-a3a5-74bb-502b-000000000071] 12033 1726867164.15440: sending task result for task 0affcac9-a3a5-74bb-502b-000000000071 12033 1726867164.15722: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000071 12033 1726867164.15724: WORKER PROCESS EXITING ok: [managed_node3] 12033 1726867164.15914: no more pending results, returning what we have 12033 1726867164.15916: results queue empty 12033 1726867164.15917: checking for any_errors_fatal 12033 1726867164.15918: done checking for any_errors_fatal 12033 1726867164.15918: checking for max_fail_percentage 12033 1726867164.15919: done checking for max_fail_percentage 12033 1726867164.15920: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.15920: done checking to see if all hosts have failed 12033 1726867164.15921: getting the 
remaining hosts for this loop 12033 1726867164.15921: done getting the remaining hosts for this loop 12033 1726867164.15924: getting the next task for host managed_node3 12033 1726867164.15927: done getting next task for host managed_node3 12033 1726867164.15929: ^ task is: TASK: meta (flush_handlers) 12033 1726867164.15930: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867164.15932: getting variables 12033 1726867164.15933: in VariableManager get_vars() 12033 1726867164.15949: Calling all_inventory to load vars for managed_node3 12033 1726867164.15951: Calling groups_inventory to load vars for managed_node3 12033 1726867164.15954: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.15961: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.15962: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.15964: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.16060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.16170: done with get_vars() 12033 1726867164.16178: done getting variables 12033 1726867164.16226: in VariableManager get_vars() 12033 1726867164.16232: Calling all_inventory to load vars for managed_node3 12033 1726867164.16234: Calling groups_inventory to load vars for managed_node3 12033 1726867164.16235: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.16238: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.16239: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.16241: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867164.16323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.16441: done with get_vars() 12033 1726867164.16449: done queuing things up, now waiting for results queue to drain 12033 1726867164.16450: results queue empty 12033 1726867164.16451: checking for any_errors_fatal 12033 1726867164.16453: done checking for any_errors_fatal 12033 1726867164.16453: checking for max_fail_percentage 12033 1726867164.16454: done checking for max_fail_percentage 12033 1726867164.16458: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.16458: done checking to see if all hosts have failed 12033 1726867164.16459: getting the remaining hosts for this loop 12033 1726867164.16459: done getting the remaining hosts for this loop 12033 1726867164.16461: getting the next task for host managed_node3 12033 1726867164.16463: done getting next task for host managed_node3 12033 1726867164.16465: ^ task is: TASK: Show playbook name 12033 1726867164.16466: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.16467: getting variables 12033 1726867164.16467: in VariableManager get_vars() 12033 1726867164.16472: Calling all_inventory to load vars for managed_node3 12033 1726867164.16474: Calling groups_inventory to load vars for managed_node3 12033 1726867164.16475: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.16480: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.16481: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.16483: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.16561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.16666: done with get_vars() 12033 1726867164.16672: done getting variables 12033 1726867164.16729: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Friday 20 September 2024 17:19:24 -0400 (0:00:00.836) 0:00:03.283 ****** 12033 1726867164.16746: entering _queue_task() for managed_node3/debug 12033 1726867164.16747: Creating lock for debug 12033 1726867164.16949: worker is 1 (out of 1 available) 12033 1726867164.16961: exiting _queue_task() for managed_node3/debug 12033 1726867164.16972: done queuing things up, now waiting for results queue to drain 12033 1726867164.16974: waiting for pending results... 
12033 1726867164.17117: running TaskExecutor() for managed_node3/TASK: Show playbook name 12033 1726867164.17165: in run() - task 0affcac9-a3a5-74bb-502b-00000000000b 12033 1726867164.17175: variable 'ansible_search_path' from source: unknown 12033 1726867164.17209: calling self._execute() 12033 1726867164.17259: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.17263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.17272: variable 'omit' from source: magic vars 12033 1726867164.17534: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.17542: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.17545: variable 'omit' from source: magic vars 12033 1726867164.17564: variable 'omit' from source: magic vars 12033 1726867164.17592: variable 'omit' from source: magic vars 12033 1726867164.17620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.17651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.17663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.17676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.17690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.17711: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.17713: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.17718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.17790: Set connection var ansible_pipelining to False 12033 1726867164.17795: Set 
connection var ansible_shell_executable to /bin/sh 12033 1726867164.17802: Set connection var ansible_timeout to 10 12033 1726867164.17807: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.17809: Set connection var ansible_connection to ssh 12033 1726867164.17814: Set connection var ansible_shell_type to sh 12033 1726867164.17830: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.17833: variable 'ansible_connection' from source: unknown 12033 1726867164.17835: variable 'ansible_module_compression' from source: unknown 12033 1726867164.17838: variable 'ansible_shell_type' from source: unknown 12033 1726867164.17840: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.17842: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.17846: variable 'ansible_pipelining' from source: unknown 12033 1726867164.17848: variable 'ansible_timeout' from source: unknown 12033 1726867164.17854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.17949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.17956: variable 'omit' from source: magic vars 12033 1726867164.17961: starting attempt loop 12033 1726867164.17966: running the handler 12033 1726867164.18003: handler run complete 12033 1726867164.18018: attempt loop complete, returning result 12033 1726867164.18021: _execute() done 12033 1726867164.18024: dumping result to json 12033 1726867164.18026: done dumping result, returning 12033 1726867164.18032: done running TaskExecutor() for managed_node3/TASK: Show playbook name [0affcac9-a3a5-74bb-502b-00000000000b] 12033 1726867164.18036: sending task result for task 
0affcac9-a3a5-74bb-502b-00000000000b 12033 1726867164.18115: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000000b 12033 1726867164.18118: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: this is: playbooks/tests_bond_options.yml 12033 1726867164.18161: no more pending results, returning what we have 12033 1726867164.18165: results queue empty 12033 1726867164.18166: checking for any_errors_fatal 12033 1726867164.18167: done checking for any_errors_fatal 12033 1726867164.18167: checking for max_fail_percentage 12033 1726867164.18169: done checking for max_fail_percentage 12033 1726867164.18169: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.18170: done checking to see if all hosts have failed 12033 1726867164.18171: getting the remaining hosts for this loop 12033 1726867164.18172: done getting the remaining hosts for this loop 12033 1726867164.18175: getting the next task for host managed_node3 12033 1726867164.18184: done getting next task for host managed_node3 12033 1726867164.18186: ^ task is: TASK: Include the task 'run_test.yml' 12033 1726867164.18190: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.18193: getting variables 12033 1726867164.18194: in VariableManager get_vars() 12033 1726867164.18216: Calling all_inventory to load vars for managed_node3 12033 1726867164.18218: Calling groups_inventory to load vars for managed_node3 12033 1726867164.18221: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.18234: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.18237: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.18240: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.18375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.18487: done with get_vars() 12033 1726867164.18495: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Friday 20 September 2024 17:19:24 -0400 (0:00:00.017) 0:00:03.301 ****** 12033 1726867164.18548: entering _queue_task() for managed_node3/include_tasks 12033 1726867164.18720: worker is 1 (out of 1 available) 12033 1726867164.18731: exiting _queue_task() for managed_node3/include_tasks 12033 1726867164.18742: done queuing things up, now waiting for results queue to drain 12033 1726867164.18743: waiting for pending results... 
12033 1726867164.18869: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 12033 1726867164.18920: in run() - task 0affcac9-a3a5-74bb-502b-00000000000d 12033 1726867164.18930: variable 'ansible_search_path' from source: unknown 12033 1726867164.18955: calling self._execute() 12033 1726867164.19015: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.19019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.19022: variable 'omit' from source: magic vars 12033 1726867164.19254: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.19263: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.19268: _execute() done 12033 1726867164.19271: dumping result to json 12033 1726867164.19273: done dumping result, returning 12033 1726867164.19281: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-74bb-502b-00000000000d] 12033 1726867164.19286: sending task result for task 0affcac9-a3a5-74bb-502b-00000000000d 12033 1726867164.19382: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000000d 12033 1726867164.19385: WORKER PROCESS EXITING 12033 1726867164.19434: no more pending results, returning what we have 12033 1726867164.19438: in VariableManager get_vars() 12033 1726867164.19461: Calling all_inventory to load vars for managed_node3 12033 1726867164.19463: Calling groups_inventory to load vars for managed_node3 12033 1726867164.19465: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.19471: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.19473: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.19474: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.19576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 12033 1726867164.19686: done with get_vars() 12033 1726867164.19693: variable 'ansible_search_path' from source: unknown 12033 1726867164.19702: we have included files to process 12033 1726867164.19702: generating all_blocks data 12033 1726867164.19703: done generating all_blocks data 12033 1726867164.19704: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867164.19704: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867164.19706: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867164.20037: in VariableManager get_vars() 12033 1726867164.20049: done with get_vars() 12033 1726867164.20074: in VariableManager get_vars() 12033 1726867164.20085: done with get_vars() 12033 1726867164.20112: in VariableManager get_vars() 12033 1726867164.20120: done with get_vars() 12033 1726867164.20143: in VariableManager get_vars() 12033 1726867164.20152: done with get_vars() 12033 1726867164.20190: in VariableManager get_vars() 12033 1726867164.20199: done with get_vars() 12033 1726867164.20425: in VariableManager get_vars() 12033 1726867164.20435: done with get_vars() 12033 1726867164.20442: done processing included file 12033 1726867164.20443: iterating over new_blocks loaded from include file 12033 1726867164.20444: in VariableManager get_vars() 12033 1726867164.20450: done with get_vars() 12033 1726867164.20451: filtering new block on tags 12033 1726867164.20514: done filtering new block on tags 12033 1726867164.20516: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 12033 1726867164.20519: extending task lists for all hosts with included 
blocks 12033 1726867164.20539: done extending task lists 12033 1726867164.20540: done processing included files 12033 1726867164.20540: results queue empty 12033 1726867164.20541: checking for any_errors_fatal 12033 1726867164.20543: done checking for any_errors_fatal 12033 1726867164.20543: checking for max_fail_percentage 12033 1726867164.20544: done checking for max_fail_percentage 12033 1726867164.20545: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.20545: done checking to see if all hosts have failed 12033 1726867164.20546: getting the remaining hosts for this loop 12033 1726867164.20546: done getting the remaining hosts for this loop 12033 1726867164.20548: getting the next task for host managed_node3 12033 1726867164.20550: done getting next task for host managed_node3 12033 1726867164.20551: ^ task is: TASK: TEST: {{ lsr_description }} 12033 1726867164.20553: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.20554: getting variables 12033 1726867164.20555: in VariableManager get_vars() 12033 1726867164.20560: Calling all_inventory to load vars for managed_node3 12033 1726867164.20561: Calling groups_inventory to load vars for managed_node3 12033 1726867164.20562: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.20566: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.20567: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.20568: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.20665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.20771: done with get_vars() 12033 1726867164.20779: done getting variables 12033 1726867164.20806: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867164.20882: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:19:24 -0400 (0:00:00.023) 0:00:03.324 ****** 12033 1726867164.20911: entering _queue_task() for managed_node3/debug 12033 1726867164.21083: worker is 1 (out of 1 available) 12033 1726867164.21097: exiting _queue_task() for managed_node3/debug 12033 1726867164.21107: done queuing things up, now waiting for results queue to drain 12033 1726867164.21109: waiting for pending results... 
12033 1726867164.21238: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 12033 1726867164.21295: in run() - task 0affcac9-a3a5-74bb-502b-000000000088 12033 1726867164.21303: variable 'ansible_search_path' from source: unknown 12033 1726867164.21307: variable 'ansible_search_path' from source: unknown 12033 1726867164.21332: calling self._execute() 12033 1726867164.21382: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.21391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.21397: variable 'omit' from source: magic vars 12033 1726867164.21622: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.21631: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.21637: variable 'omit' from source: magic vars 12033 1726867164.21659: variable 'omit' from source: magic vars 12033 1726867164.21729: variable 'lsr_description' from source: include params 12033 1726867164.21743: variable 'omit' from source: magic vars 12033 1726867164.21771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.21802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.21815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.21827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.21836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.21857: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.21860: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.21864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.21932: Set connection var ansible_pipelining to False 12033 1726867164.21939: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.21946: Set connection var ansible_timeout to 10 12033 1726867164.21951: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.21953: Set connection var ansible_connection to ssh 12033 1726867164.21958: Set connection var ansible_shell_type to sh 12033 1726867164.21973: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.21976: variable 'ansible_connection' from source: unknown 12033 1726867164.21980: variable 'ansible_module_compression' from source: unknown 12033 1726867164.21982: variable 'ansible_shell_type' from source: unknown 12033 1726867164.21985: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.21987: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.21991: variable 'ansible_pipelining' from source: unknown 12033 1726867164.21994: variable 'ansible_timeout' from source: unknown 12033 1726867164.21997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.22091: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.22096: variable 'omit' from source: magic vars 12033 1726867164.22101: starting attempt loop 12033 1726867164.22104: running the handler 12033 1726867164.22138: handler run complete 12033 1726867164.22148: attempt loop complete, 
returning result 12033 1726867164.22150: _execute() done 12033 1726867164.22153: dumping result to json 12033 1726867164.22156: done dumping result, returning 12033 1726867164.22161: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0affcac9-a3a5-74bb-502b-000000000088] 12033 1726867164.22166: sending task result for task 0affcac9-a3a5-74bb-502b-000000000088 12033 1726867164.22248: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000088 12033 1726867164.22251: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 12033 1726867164.22297: no more pending results, returning what we have 12033 1726867164.22300: results queue empty 12033 1726867164.22301: checking for any_errors_fatal 12033 1726867164.22302: done checking for any_errors_fatal 12033 1726867164.22302: checking for max_fail_percentage 12033 1726867164.22304: done checking for max_fail_percentage 12033 1726867164.22304: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.22305: done checking to see if all hosts have failed 12033 1726867164.22306: getting the remaining hosts for this loop 12033 1726867164.22307: done getting the remaining hosts for this loop 12033 1726867164.22310: getting the next task for host managed_node3 12033 1726867164.22314: done getting next task for host managed_node3 12033 1726867164.22316: ^ task is: TASK: Show item 12033 1726867164.22319: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867164.22321: getting variables 12033 1726867164.22323: in VariableManager get_vars() 12033 1726867164.22343: Calling all_inventory to load vars for managed_node3 12033 1726867164.22345: Calling groups_inventory to load vars for managed_node3 12033 1726867164.22348: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.22355: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.22357: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.22362: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.22464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.22573: done with get_vars() 12033 1726867164.22583: done getting variables 12033 1726867164.22618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:19:24 -0400 (0:00:00.017) 0:00:03.342 ****** 12033 1726867164.22638: entering _queue_task() for managed_node3/debug 12033 1726867164.22801: worker is 1 (out of 1 available) 12033 
1726867164.22812: exiting _queue_task() for managed_node3/debug 12033 1726867164.22824: done queuing things up, now waiting for results queue to drain 12033 1726867164.22825: waiting for pending results... 12033 1726867164.22946: running TaskExecutor() for managed_node3/TASK: Show item 12033 1726867164.22998: in run() - task 0affcac9-a3a5-74bb-502b-000000000089 12033 1726867164.23009: variable 'ansible_search_path' from source: unknown 12033 1726867164.23012: variable 'ansible_search_path' from source: unknown 12033 1726867164.23047: variable 'omit' from source: magic vars 12033 1726867164.23129: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.23136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.23144: variable 'omit' from source: magic vars 12033 1726867164.23555: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.23564: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.23569: variable 'omit' from source: magic vars 12033 1726867164.23595: variable 'omit' from source: magic vars 12033 1726867164.23626: variable 'item' from source: unknown 12033 1726867164.23675: variable 'item' from source: unknown 12033 1726867164.23693: variable 'omit' from source: magic vars 12033 1726867164.23724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.23745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.23759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.23772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.23781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 12033 1726867164.23803: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.23806: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.23809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.23873: Set connection var ansible_pipelining to False 12033 1726867164.23881: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.23891: Set connection var ansible_timeout to 10 12033 1726867164.23893: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.23896: Set connection var ansible_connection to ssh 12033 1726867164.23900: Set connection var ansible_shell_type to sh 12033 1726867164.23916: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.23919: variable 'ansible_connection' from source: unknown 12033 1726867164.23921: variable 'ansible_module_compression' from source: unknown 12033 1726867164.23924: variable 'ansible_shell_type' from source: unknown 12033 1726867164.23926: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.23928: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.23932: variable 'ansible_pipelining' from source: unknown 12033 1726867164.23943: variable 'ansible_timeout' from source: unknown 12033 1726867164.23946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.24029: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.24037: variable 'omit' from source: magic vars 12033 1726867164.24044: starting attempt loop 12033 1726867164.24046: running the handler 12033 1726867164.24080: variable 
'lsr_description' from source: include params 12033 1726867164.24123: variable 'lsr_description' from source: include params 12033 1726867164.24130: handler run complete 12033 1726867164.24143: attempt loop complete, returning result 12033 1726867164.24154: variable 'item' from source: unknown 12033 1726867164.24200: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 12033 1726867164.24323: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.24327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.24329: variable 'omit' from source: magic vars 12033 1726867164.24400: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.24404: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.24408: variable 'omit' from source: magic vars 12033 1726867164.24419: variable 'omit' from source: magic vars 12033 1726867164.24449: variable 'item' from source: unknown 12033 1726867164.24492: variable 'item' from source: unknown 12033 1726867164.24501: variable 'omit' from source: magic vars 12033 1726867164.24515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.24522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.24528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.24537: variable 'inventory_hostname' 
from source: host vars for 'managed_node3' 12033 1726867164.24539: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.24542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.24592: Set connection var ansible_pipelining to False 12033 1726867164.24595: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.24602: Set connection var ansible_timeout to 10 12033 1726867164.24606: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.24609: Set connection var ansible_connection to ssh 12033 1726867164.24614: Set connection var ansible_shell_type to sh 12033 1726867164.24627: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.24630: variable 'ansible_connection' from source: unknown 12033 1726867164.24632: variable 'ansible_module_compression' from source: unknown 12033 1726867164.24635: variable 'ansible_shell_type' from source: unknown 12033 1726867164.24637: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.24639: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.24643: variable 'ansible_pipelining' from source: unknown 12033 1726867164.24646: variable 'ansible_timeout' from source: unknown 12033 1726867164.24650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.24709: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.24717: variable 'omit' from source: magic vars 12033 1726867164.24719: starting attempt loop 12033 1726867164.24722: running the handler 12033 1726867164.24738: variable 'lsr_setup' from source: include params 12033 1726867164.24785: variable 
'lsr_setup' from source: include params 12033 1726867164.24818: handler run complete 12033 1726867164.24830: attempt loop complete, returning result 12033 1726867164.24842: variable 'item' from source: unknown 12033 1726867164.24887: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 12033 1726867164.24957: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.24960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.24966: variable 'omit' from source: magic vars 12033 1726867164.25060: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.25064: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.25066: variable 'omit' from source: magic vars 12033 1726867164.25079: variable 'omit' from source: magic vars 12033 1726867164.25108: variable 'item' from source: unknown 12033 1726867164.25149: variable 'item' from source: unknown 12033 1726867164.25159: variable 'omit' from source: magic vars 12033 1726867164.25172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.25180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.25185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.25195: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.25198: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.25207: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12033 1726867164.25246: Set connection var ansible_pipelining to False 12033 1726867164.25252: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.25258: Set connection var ansible_timeout to 10 12033 1726867164.25263: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.25265: Set connection var ansible_connection to ssh 12033 1726867164.25270: Set connection var ansible_shell_type to sh 12033 1726867164.25284: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.25287: variable 'ansible_connection' from source: unknown 12033 1726867164.25292: variable 'ansible_module_compression' from source: unknown 12033 1726867164.25294: variable 'ansible_shell_type' from source: unknown 12033 1726867164.25296: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.25299: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.25301: variable 'ansible_pipelining' from source: unknown 12033 1726867164.25303: variable 'ansible_timeout' from source: unknown 12033 1726867164.25305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.25361: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.25367: variable 'omit' from source: magic vars 12033 1726867164.25370: starting attempt loop 12033 1726867164.25373: running the handler 12033 1726867164.25392: variable 'lsr_test' from source: include params 12033 1726867164.25434: variable 'lsr_test' from source: include params 12033 1726867164.25446: handler run complete 12033 1726867164.25455: attempt loop complete, returning result 12033 1726867164.25466: variable 'item' from source: unknown 12033 
1726867164.25510: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 12033 1726867164.25579: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.25582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.25595: variable 'omit' from source: magic vars 12033 1726867164.25681: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.25685: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.25693: variable 'omit' from source: magic vars 12033 1726867164.25702: variable 'omit' from source: magic vars 12033 1726867164.25730: variable 'item' from source: unknown 12033 1726867164.25771: variable 'item' from source: unknown 12033 1726867164.25784: variable 'omit' from source: magic vars 12033 1726867164.25802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.25809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.25818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.25825: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.25827: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.25830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.25871: Set connection var ansible_pipelining to False 12033 1726867164.25878: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.25885: Set connection var ansible_timeout to 10 12033 1726867164.25892: Set connection var 
ansible_module_compression to ZIP_DEFLATED 12033 1726867164.25895: Set connection var ansible_connection to ssh 12033 1726867164.25899: Set connection var ansible_shell_type to sh 12033 1726867164.25914: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.25916: variable 'ansible_connection' from source: unknown 12033 1726867164.25920: variable 'ansible_module_compression' from source: unknown 12033 1726867164.25922: variable 'ansible_shell_type' from source: unknown 12033 1726867164.25924: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.25927: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.25929: variable 'ansible_pipelining' from source: unknown 12033 1726867164.25931: variable 'ansible_timeout' from source: unknown 12033 1726867164.25933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.25990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.25998: variable 'omit' from source: magic vars 12033 1726867164.26001: starting attempt loop 12033 1726867164.26003: running the handler 12033 1726867164.26017: variable 'lsr_assert' from source: include params 12033 1726867164.26064: variable 'lsr_assert' from source: include params 12033 1726867164.26078: handler run complete 12033 1726867164.26088: attempt loop complete, returning result 12033 1726867164.26101: variable 'item' from source: unknown 12033 1726867164.26142: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", 
"tasks/assert_bond_options.yml" ] } 12033 1726867164.26216: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.26219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.26226: variable 'omit' from source: magic vars 12033 1726867164.26321: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.26324: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.26327: variable 'omit' from source: magic vars 12033 1726867164.26338: variable 'omit' from source: magic vars 12033 1726867164.26363: variable 'item' from source: unknown 12033 1726867164.26411: variable 'item' from source: unknown 12033 1726867164.26421: variable 'omit' from source: magic vars 12033 1726867164.26435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.26441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.26447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.26455: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.26458: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.26460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.26509: Set connection var ansible_pipelining to False 12033 1726867164.26515: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.26521: Set connection var ansible_timeout to 10 12033 1726867164.26526: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.26528: Set connection var ansible_connection to ssh 12033 1726867164.26533: Set connection var 
ansible_shell_type to sh 12033 1726867164.26546: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.26549: variable 'ansible_connection' from source: unknown 12033 1726867164.26551: variable 'ansible_module_compression' from source: unknown 12033 1726867164.26553: variable 'ansible_shell_type' from source: unknown 12033 1726867164.26555: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.26558: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.26562: variable 'ansible_pipelining' from source: unknown 12033 1726867164.26564: variable 'ansible_timeout' from source: unknown 12033 1726867164.26568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.26628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.26634: variable 'omit' from source: magic vars 12033 1726867164.26637: starting attempt loop 12033 1726867164.26640: running the handler 12033 1726867164.26718: handler run complete 12033 1726867164.26726: attempt loop complete, returning result 12033 1726867164.26737: variable 'item' from source: unknown 12033 1726867164.26780: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 12033 1726867164.27072: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27079: variable 'omit' from source: magic vars 12033 1726867164.27082: variable 'ansible_distribution_major_version' from source: 
facts 12033 1726867164.27084: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.27092: variable 'omit' from source: magic vars 12033 1726867164.27095: variable 'omit' from source: magic vars 12033 1726867164.27098: variable 'item' from source: unknown 12033 1726867164.27116: variable 'item' from source: unknown 12033 1726867164.27127: variable 'omit' from source: magic vars 12033 1726867164.27139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.27146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.27151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.27160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.27162: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27214: Set connection var ansible_pipelining to False 12033 1726867164.27219: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.27226: Set connection var ansible_timeout to 10 12033 1726867164.27231: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.27234: Set connection var ansible_connection to ssh 12033 1726867164.27239: Set connection var ansible_shell_type to sh 12033 1726867164.27251: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.27254: variable 'ansible_connection' from source: unknown 12033 1726867164.27256: variable 'ansible_module_compression' from source: unknown 12033 1726867164.27258: variable 'ansible_shell_type' from source: unknown 12033 1726867164.27261: variable 
'ansible_shell_executable' from source: unknown 12033 1726867164.27263: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27267: variable 'ansible_pipelining' from source: unknown 12033 1726867164.27269: variable 'ansible_timeout' from source: unknown 12033 1726867164.27273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27334: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.27340: variable 'omit' from source: magic vars 12033 1726867164.27343: starting attempt loop 12033 1726867164.27345: running the handler 12033 1726867164.27359: variable 'lsr_fail_debug' from source: play vars 12033 1726867164.27405: variable 'lsr_fail_debug' from source: play vars 12033 1726867164.27420: handler run complete 12033 1726867164.27429: attempt loop complete, returning result 12033 1726867164.27439: variable 'item' from source: unknown 12033 1726867164.27485: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 12033 1726867164.27557: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27562: variable 'omit' from source: magic vars 12033 1726867164.27658: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.27662: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.27666: variable 'omit' from source: magic vars 12033 1726867164.27680: variable 'omit' from source: magic vars 12033 1726867164.27708: 
variable 'item' from source: unknown 12033 1726867164.27749: variable 'item' from source: unknown 12033 1726867164.27760: variable 'omit' from source: magic vars 12033 1726867164.27773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.27779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.27791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.27800: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.27803: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27845: Set connection var ansible_pipelining to False 12033 1726867164.27852: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.27858: Set connection var ansible_timeout to 10 12033 1726867164.27863: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.27865: Set connection var ansible_connection to ssh 12033 1726867164.27870: Set connection var ansible_shell_type to sh 12033 1726867164.27885: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.27890: variable 'ansible_connection' from source: unknown 12033 1726867164.27894: variable 'ansible_module_compression' from source: unknown 12033 1726867164.27896: variable 'ansible_shell_type' from source: unknown 12033 1726867164.27899: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.27902: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.27905: variable 'ansible_pipelining' from source: unknown 12033 1726867164.27907: variable 'ansible_timeout' from 
source: unknown 12033 1726867164.27909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.27961: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.27967: variable 'omit' from source: magic vars 12033 1726867164.27969: starting attempt loop 12033 1726867164.27972: running the handler 12033 1726867164.27990: variable 'lsr_cleanup' from source: include params 12033 1726867164.28033: variable 'lsr_cleanup' from source: include params 12033 1726867164.28046: handler run complete 12033 1726867164.28055: attempt loop complete, returning result 12033 1726867164.28066: variable 'item' from source: unknown 12033 1726867164.28129: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 12033 1726867164.28201: dumping result to json 12033 1726867164.28204: done dumping result, returning 12033 1726867164.28206: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-74bb-502b-000000000089] 12033 1726867164.28209: sending task result for task 0affcac9-a3a5-74bb-502b-000000000089 12033 1726867164.28247: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000089 12033 1726867164.28250: WORKER PROCESS EXITING 12033 1726867164.28299: no more pending results, returning what we have 12033 1726867164.28301: results queue empty 12033 1726867164.28302: checking for any_errors_fatal 12033 1726867164.28307: done checking for any_errors_fatal 12033 1726867164.28307: checking for max_fail_percentage 12033 1726867164.28308: done checking for max_fail_percentage 12033 
1726867164.28309: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.28310: done checking to see if all hosts have failed 12033 1726867164.28310: getting the remaining hosts for this loop 12033 1726867164.28312: done getting the remaining hosts for this loop 12033 1726867164.28315: getting the next task for host managed_node3 12033 1726867164.28319: done getting next task for host managed_node3 12033 1726867164.28321: ^ task is: TASK: Include the task 'show_interfaces.yml' 12033 1726867164.28323: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.28326: getting variables 12033 1726867164.28327: in VariableManager get_vars() 12033 1726867164.28348: Calling all_inventory to load vars for managed_node3 12033 1726867164.28350: Calling groups_inventory to load vars for managed_node3 12033 1726867164.28353: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.28363: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.28365: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.28367: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.28481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.28592: done with get_vars() 12033 1726867164.28598: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:19:24 -0400 (0:00:00.060) 0:00:03.402 ****** 12033 1726867164.28653: entering _queue_task() for managed_node3/include_tasks 12033 1726867164.28827: worker is 1 (out of 1 available) 12033 1726867164.28839: exiting _queue_task() for managed_node3/include_tasks 12033 1726867164.28851: done queuing things up, now waiting for results queue to drain 12033 1726867164.28852: waiting for pending results... 
12033 1726867164.29045: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 12033 1726867164.29105: in run() - task 0affcac9-a3a5-74bb-502b-00000000008a 12033 1726867164.29114: variable 'ansible_search_path' from source: unknown 12033 1726867164.29117: variable 'ansible_search_path' from source: unknown 12033 1726867164.29142: calling self._execute() 12033 1726867164.29208: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.29212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.29215: variable 'omit' from source: magic vars 12033 1726867164.29445: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.29454: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.29459: _execute() done 12033 1726867164.29462: dumping result to json 12033 1726867164.29464: done dumping result, returning 12033 1726867164.29471: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-74bb-502b-00000000008a] 12033 1726867164.29474: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008a 12033 1726867164.29583: no more pending results, returning what we have 12033 1726867164.29589: in VariableManager get_vars() 12033 1726867164.29616: Calling all_inventory to load vars for managed_node3 12033 1726867164.29618: Calling groups_inventory to load vars for managed_node3 12033 1726867164.29620: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.29629: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.29631: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.29636: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.29769: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008a 12033 1726867164.29772: WORKER PROCESS EXITING 12033 1726867164.29784: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.29894: done with get_vars() 12033 1726867164.29899: variable 'ansible_search_path' from source: unknown 12033 1726867164.29900: variable 'ansible_search_path' from source: unknown 12033 1726867164.29927: we have included files to process 12033 1726867164.29927: generating all_blocks data 12033 1726867164.29929: done generating all_blocks data 12033 1726867164.29931: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12033 1726867164.29932: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12033 1726867164.29933: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12033 1726867164.30027: in VariableManager get_vars() 12033 1726867164.30038: done with get_vars() 12033 1726867164.30110: done processing included file 12033 1726867164.30111: iterating over new_blocks loaded from include file 12033 1726867164.30112: in VariableManager get_vars() 12033 1726867164.30120: done with get_vars() 12033 1726867164.30121: filtering new block on tags 12033 1726867164.30140: done filtering new block on tags 12033 1726867164.30142: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 12033 1726867164.30145: extending task lists for all hosts with included blocks 12033 1726867164.30405: done extending task lists 12033 1726867164.30406: done processing included files 12033 1726867164.30406: results queue empty 12033 1726867164.30407: checking for any_errors_fatal 12033 1726867164.30410: done checking for any_errors_fatal 12033 1726867164.30411: checking for 
max_fail_percentage 12033 1726867164.30411: done checking for max_fail_percentage 12033 1726867164.30412: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.30412: done checking to see if all hosts have failed 12033 1726867164.30413: getting the remaining hosts for this loop 12033 1726867164.30414: done getting the remaining hosts for this loop 12033 1726867164.30415: getting the next task for host managed_node3 12033 1726867164.30418: done getting next task for host managed_node3 12033 1726867164.30419: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 12033 1726867164.30420: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.30422: getting variables 12033 1726867164.30423: in VariableManager get_vars() 12033 1726867164.30428: Calling all_inventory to load vars for managed_node3 12033 1726867164.30430: Calling groups_inventory to load vars for managed_node3 12033 1726867164.30432: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.30445: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.30448: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.30451: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.30642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.30838: done with get_vars() 12033 1726867164.30847: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:19:24 -0400 (0:00:00.022) 0:00:03.424 ****** 12033 1726867164.30915: entering _queue_task() for managed_node3/include_tasks 12033 1726867164.31133: worker is 1 (out of 1 available) 12033 1726867164.31146: exiting _queue_task() for managed_node3/include_tasks 12033 1726867164.31158: done queuing things up, now waiting for results queue to drain 12033 1726867164.31159: waiting for pending results... 
12033 1726867164.31376: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 12033 1726867164.31472: in run() - task 0affcac9-a3a5-74bb-502b-0000000000b1 12033 1726867164.31495: variable 'ansible_search_path' from source: unknown 12033 1726867164.31504: variable 'ansible_search_path' from source: unknown 12033 1726867164.31539: calling self._execute() 12033 1726867164.31620: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.31631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.31646: variable 'omit' from source: magic vars 12033 1726867164.31983: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.32015: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.32018: _execute() done 12033 1726867164.32020: dumping result to json 12033 1726867164.32023: done dumping result, returning 12033 1726867164.32036: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-74bb-502b-0000000000b1] 12033 1726867164.32123: sending task result for task 0affcac9-a3a5-74bb-502b-0000000000b1 12033 1726867164.32188: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000000b1 12033 1726867164.32191: WORKER PROCESS EXITING 12033 1726867164.32216: no more pending results, returning what we have 12033 1726867164.32221: in VariableManager get_vars() 12033 1726867164.32256: Calling all_inventory to load vars for managed_node3 12033 1726867164.32260: Calling groups_inventory to load vars for managed_node3 12033 1726867164.32263: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.32275: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.32280: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.32283: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867164.32582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.32765: done with get_vars() 12033 1726867164.32780: variable 'ansible_search_path' from source: unknown 12033 1726867164.32781: variable 'ansible_search_path' from source: unknown 12033 1726867164.32812: we have included files to process 12033 1726867164.32814: generating all_blocks data 12033 1726867164.32815: done generating all_blocks data 12033 1726867164.32817: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12033 1726867164.32818: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12033 1726867164.32820: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12033 1726867164.33114: done processing included file 12033 1726867164.33116: iterating over new_blocks loaded from include file 12033 1726867164.33117: in VariableManager get_vars() 12033 1726867164.33131: done with get_vars() 12033 1726867164.33132: filtering new block on tags 12033 1726867164.33164: done filtering new block on tags 12033 1726867164.33167: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 12033 1726867164.33171: extending task lists for all hosts with included blocks 12033 1726867164.33339: done extending task lists 12033 1726867164.33340: done processing included files 12033 1726867164.33341: results queue empty 12033 1726867164.33341: checking for any_errors_fatal 12033 1726867164.33344: done checking for any_errors_fatal 12033 1726867164.33345: checking for max_fail_percentage 12033 1726867164.33346: done 
checking for max_fail_percentage 12033 1726867164.33347: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.33347: done checking to see if all hosts have failed 12033 1726867164.33348: getting the remaining hosts for this loop 12033 1726867164.33349: done getting the remaining hosts for this loop 12033 1726867164.33352: getting the next task for host managed_node3 12033 1726867164.33356: done getting next task for host managed_node3 12033 1726867164.33358: ^ task is: TASK: Gather current interface info 12033 1726867164.33360: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.33363: getting variables 12033 1726867164.33364: in VariableManager get_vars() 12033 1726867164.33371: Calling all_inventory to load vars for managed_node3 12033 1726867164.33373: Calling groups_inventory to load vars for managed_node3 12033 1726867164.33375: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.33382: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.33384: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.33387: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.33516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.33627: done with get_vars() 12033 1726867164.33633: done getting variables 12033 1726867164.33661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:19:24 -0400 (0:00:00.027) 0:00:03.452 ****** 12033 1726867164.33682: entering _queue_task() for managed_node3/command 12033 1726867164.33842: worker is 1 (out of 1 available) 12033 1726867164.33855: exiting _queue_task() for managed_node3/command 12033 1726867164.33867: done queuing things up, now waiting for results queue to drain 12033 1726867164.33869: waiting for pending results... 
12033 1726867164.34016: running TaskExecutor() for managed_node3/TASK: Gather current interface info 12033 1726867164.34080: in run() - task 0affcac9-a3a5-74bb-502b-0000000000ec 12033 1726867164.34092: variable 'ansible_search_path' from source: unknown 12033 1726867164.34097: variable 'ansible_search_path' from source: unknown 12033 1726867164.34129: calling self._execute() 12033 1726867164.34180: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.34185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.34196: variable 'omit' from source: magic vars 12033 1726867164.34455: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.34464: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.34469: variable 'omit' from source: magic vars 12033 1726867164.34507: variable 'omit' from source: magic vars 12033 1726867164.34530: variable 'omit' from source: magic vars 12033 1726867164.34562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.34590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.34608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.34621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.34630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.34655: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.34659: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.34662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 
1726867164.34728: Set connection var ansible_pipelining to False 12033 1726867164.34734: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.34741: Set connection var ansible_timeout to 10 12033 1726867164.34746: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.34749: Set connection var ansible_connection to ssh 12033 1726867164.34754: Set connection var ansible_shell_type to sh 12033 1726867164.34775: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.34780: variable 'ansible_connection' from source: unknown 12033 1726867164.34783: variable 'ansible_module_compression' from source: unknown 12033 1726867164.34785: variable 'ansible_shell_type' from source: unknown 12033 1726867164.34788: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.34790: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.34795: variable 'ansible_pipelining' from source: unknown 12033 1726867164.34797: variable 'ansible_timeout' from source: unknown 12033 1726867164.34801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.34899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.34907: variable 'omit' from source: magic vars 12033 1726867164.34912: starting attempt loop 12033 1726867164.34914: running the handler 12033 1726867164.34926: _low_level_execute_command(): starting 12033 1726867164.34933: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867164.35545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.35550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867164.35585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.35588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.35655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.37346: stdout chunk (state=3): >>>/root <<< 12033 1726867164.37482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.37486: stdout chunk (state=3): >>><<< 12033 1726867164.37506: stderr chunk (state=3): >>><<< 12033 1726867164.37605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867164.37608: _low_level_execute_command(): starting 12033 1726867164.37611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678 `" && echo ansible-tmp-1726867164.3752348-12193-74909228152678="` echo /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678 `" ) && sleep 0' 12033 1726867164.38118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867164.38134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867164.38150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867164.38197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867164.38219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867164.38237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867164.38298: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.38328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867164.38346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.38368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.38448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.40334: stdout chunk (state=3): >>>ansible-tmp-1726867164.3752348-12193-74909228152678=/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678 <<< 12033 1726867164.40462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.40484: stdout chunk (state=3): >>><<< 12033 1726867164.40487: stderr chunk (state=3): >>><<< 12033 1726867164.40499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867164.3752348-12193-74909228152678=/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867164.40524: variable 'ansible_module_compression' from source: unknown 12033 1726867164.40564: ANSIBALLZ: Using generic lock for ansible.legacy.command 12033 1726867164.40567: ANSIBALLZ: Acquiring lock 12033 1726867164.40569: ANSIBALLZ: Lock acquired: 139897899327968 12033 1726867164.40572: ANSIBALLZ: Creating module 12033 1726867164.60785: ANSIBALLZ: Writing module into payload 12033 1726867164.60984: ANSIBALLZ: Writing module 12033 1726867164.60990: ANSIBALLZ: Renaming module 12033 1726867164.60994: ANSIBALLZ: Done creating module 12033 1726867164.60998: variable 'ansible_facts' from source: unknown 12033 1726867164.61001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py 12033 1726867164.61248: Sending initial data 12033 1726867164.61258: Sent initial data (155 bytes) 12033 1726867164.62597: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.62626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867164.62644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.62895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.62984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.64625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12033 1726867164.64640: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867164.64669: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867164.64712: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpm_w_0gab /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py <<< 12033 1726867164.64718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py" <<< 12033 1726867164.64841: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpm_w_0gab" to remote "/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py" <<< 12033 1726867164.64854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py" <<< 12033 1726867164.66195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.66227: stderr chunk (state=3): >>><<< 12033 1726867164.66236: stdout chunk (state=3): >>><<< 12033 1726867164.66302: done transferring module to remote 12033 1726867164.66399: _low_level_execute_command(): starting 12033 1726867164.66410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/ /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py && sleep 0' 12033 1726867164.67957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.68206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.68262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.70228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.70231: stdout chunk (state=3): >>><<< 12033 1726867164.70234: stderr chunk (state=3): >>><<< 12033 1726867164.70250: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867164.70258: _low_level_execute_command(): starting 12033 1726867164.70268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/AnsiballZ_command.py && sleep 0' 12033 1726867164.71370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867164.71390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867164.71407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867164.71505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867164.71598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.71618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867164.71635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.71668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12033 1726867164.71900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.87858: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:19:24.872637", "end": "2024-09-20 17:19:24.876181", "delta": "0:00:00.003544", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867164.89643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867164.89656: stdout chunk (state=3): >>><<< 12033 1726867164.89669: stderr chunk (state=3): >>><<< 12033 1726867164.89699: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:19:24.872637", "end": "2024-09-20 17:19:24.876181", "delta": "0:00:00.003544", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867164.89742: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867164.89758: _low_level_execute_command(): starting 12033 1726867164.89769: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867164.3752348-12193-74909228152678/ > /dev/null 2>&1 && sleep 0' 12033 1726867164.90350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867164.90362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867164.90373: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867164.90395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867164.90410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867164.90419: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867164.90434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867164.90523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867164.90551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867164.90617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867164.92853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867164.92862: stdout chunk (state=3): >>><<< 12033 1726867164.92873: stderr chunk (state=3): >>><<< 12033 1726867164.92895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867164.92906: handler run complete 12033 1726867164.92934: Evaluated conditional (False): False 12033 1726867164.92948: attempt loop complete, returning result 12033 1726867164.92955: _execute() done 12033 1726867164.92961: dumping result to json 12033 1726867164.92969: done dumping result, returning 12033 1726867164.92982: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-74bb-502b-0000000000ec] 12033 1726867164.93083: sending task result for task 0affcac9-a3a5-74bb-502b-0000000000ec ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003544", "end": "2024-09-20 17:19:24.876181", "rc": 0, "start": "2024-09-20 17:19:24.872637" } STDOUT: bonding_masters eth0 lo 12033 1726867164.93231: no more pending results, returning what we have 12033 1726867164.93235: results queue empty 12033 1726867164.93237: checking for any_errors_fatal 12033 1726867164.93238: done checking for any_errors_fatal 12033 1726867164.93239: checking for max_fail_percentage 12033 1726867164.93241: done checking for max_fail_percentage 12033 
1726867164.93241: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.93242: done checking to see if all hosts have failed 12033 1726867164.93243: getting the remaining hosts for this loop 12033 1726867164.93244: done getting the remaining hosts for this loop 12033 1726867164.93248: getting the next task for host managed_node3 12033 1726867164.93255: done getting next task for host managed_node3 12033 1726867164.93257: ^ task is: TASK: Set current_interfaces 12033 1726867164.93262: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.93266: getting variables 12033 1726867164.93268: in VariableManager get_vars() 12033 1726867164.93301: Calling all_inventory to load vars for managed_node3 12033 1726867164.93303: Calling groups_inventory to load vars for managed_node3 12033 1726867164.93307: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.93318: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.93321: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.93324: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.93606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.94063: done with get_vars() 12033 1726867164.94074: done getting variables 12033 1726867164.94139: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000000ec 12033 1726867164.94143: WORKER PROCESS EXITING 12033 1726867164.94176: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:19:24 -0400 (0:00:00.605) 0:00:04.057 ****** 12033 1726867164.94210: entering _queue_task() for managed_node3/set_fact 12033 1726867164.94469: worker is 1 (out of 1 available) 12033 1726867164.94685: exiting _queue_task() for managed_node3/set_fact 12033 1726867164.94694: done queuing things up, now waiting for results queue to drain 12033 1726867164.94696: waiting for pending results... 
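The `set_fact` task queued here consumes the `_current_interfaces` result registered by the previous command task. The actual Jinja2 expression is not visible in this log, so the reconstruction below is an assumption: it simply splits the registered stdout into a list of interface names.

```python
def set_current_interfaces(command_result):
    """Sketch of what the 'Set current_interfaces' set_fact task likely
    computes: the registered command's stdout split into a list of
    interface names. The real task's expression is not shown in the
    log, so this is an assumption."""
    return command_result["stdout"].splitlines()

# Shape of the registered result as seen in the log's module output.
registered = {"rc": 0, "stdout": "bonding_masters\neth0\nlo"}
current_interfaces = set_current_interfaces(registered)
```

With the stdout captured above, `current_interfaces` would be `['bonding_masters', 'eth0', 'lo']`.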
12033 1726867164.94729: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 12033 1726867164.94842: in run() - task 0affcac9-a3a5-74bb-502b-0000000000ed 12033 1726867164.94860: variable 'ansible_search_path' from source: unknown 12033 1726867164.94868: variable 'ansible_search_path' from source: unknown 12033 1726867164.94908: calling self._execute() 12033 1726867164.94986: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.94998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.95011: variable 'omit' from source: magic vars 12033 1726867164.95360: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.95375: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.95388: variable 'omit' from source: magic vars 12033 1726867164.95440: variable 'omit' from source: magic vars 12033 1726867164.95550: variable '_current_interfaces' from source: set_fact 12033 1726867164.95620: variable 'omit' from source: magic vars 12033 1726867164.95661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.95708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.95732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.95756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.95773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.95811: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.95820: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.95828: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.95932: Set connection var ansible_pipelining to False 12033 1726867164.95947: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.95960: Set connection var ansible_timeout to 10 12033 1726867164.95968: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.95973: Set connection var ansible_connection to ssh 12033 1726867164.95982: Set connection var ansible_shell_type to sh 12033 1726867164.96007: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.96014: variable 'ansible_connection' from source: unknown 12033 1726867164.96019: variable 'ansible_module_compression' from source: unknown 12033 1726867164.96024: variable 'ansible_shell_type' from source: unknown 12033 1726867164.96029: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.96034: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.96040: variable 'ansible_pipelining' from source: unknown 12033 1726867164.96045: variable 'ansible_timeout' from source: unknown 12033 1726867164.96050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.96171: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.96230: variable 'omit' from source: magic vars 12033 1726867164.96242: starting attempt loop 12033 1726867164.96249: running the handler 12033 1726867164.96273: handler run complete 12033 1726867164.96395: attempt loop complete, returning result 12033 1726867164.96399: _execute() done 12033 1726867164.96401: dumping result to json 12033 1726867164.96403: done dumping result, returning 12033 
1726867164.96406: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-74bb-502b-0000000000ed] 12033 1726867164.96408: sending task result for task 0affcac9-a3a5-74bb-502b-0000000000ed 12033 1726867164.96480: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000000ed 12033 1726867164.96485: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 12033 1726867164.96538: no more pending results, returning what we have 12033 1726867164.96541: results queue empty 12033 1726867164.96542: checking for any_errors_fatal 12033 1726867164.96551: done checking for any_errors_fatal 12033 1726867164.96552: checking for max_fail_percentage 12033 1726867164.96554: done checking for max_fail_percentage 12033 1726867164.96555: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.96555: done checking to see if all hosts have failed 12033 1726867164.96556: getting the remaining hosts for this loop 12033 1726867164.96557: done getting the remaining hosts for this loop 12033 1726867164.96560: getting the next task for host managed_node3 12033 1726867164.96569: done getting next task for host managed_node3 12033 1726867164.96571: ^ task is: TASK: Show current_interfaces 12033 1726867164.96574: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867164.96681: getting variables 12033 1726867164.96683: in VariableManager get_vars() 12033 1726867164.96718: Calling all_inventory to load vars for managed_node3 12033 1726867164.96720: Calling groups_inventory to load vars for managed_node3 12033 1726867164.96724: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.96732: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.96735: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.96738: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.96932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867164.97138: done with get_vars() 12033 1726867164.97155: done getting variables 12033 1726867164.97216: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:19:24 -0400 (0:00:00.030) 0:00:04.088 ****** 12033 1726867164.97246: entering _queue_task() for managed_node3/debug 12033 1726867164.97499: worker is 1 (out of 1 available) 12033 1726867164.97512: exiting _queue_task() for managed_node3/debug 12033 1726867164.97523: done queuing things up, now waiting for results queue to drain 12033 1726867164.97525: waiting for pending results... 
12033 1726867164.97894: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 12033 1726867164.97900: in run() - task 0affcac9-a3a5-74bb-502b-0000000000b2 12033 1726867164.97903: variable 'ansible_search_path' from source: unknown 12033 1726867164.97906: variable 'ansible_search_path' from source: unknown 12033 1726867164.97908: calling self._execute() 12033 1726867164.97948: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.97960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.97976: variable 'omit' from source: magic vars 12033 1726867164.98325: variable 'ansible_distribution_major_version' from source: facts 12033 1726867164.98342: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867164.98355: variable 'omit' from source: magic vars 12033 1726867164.98405: variable 'omit' from source: magic vars 12033 1726867164.98507: variable 'current_interfaces' from source: set_fact 12033 1726867164.98542: variable 'omit' from source: magic vars 12033 1726867164.98591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867164.98630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867164.98654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867164.98681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.98702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867164.98735: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867164.98745: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.98796: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.98859: Set connection var ansible_pipelining to False 12033 1726867164.98874: Set connection var ansible_shell_executable to /bin/sh 12033 1726867164.98889: Set connection var ansible_timeout to 10 12033 1726867164.98903: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867164.98913: Set connection var ansible_connection to ssh 12033 1726867164.98924: Set connection var ansible_shell_type to sh 12033 1726867164.98948: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.98983: variable 'ansible_connection' from source: unknown 12033 1726867164.98987: variable 'ansible_module_compression' from source: unknown 12033 1726867164.98989: variable 'ansible_shell_type' from source: unknown 12033 1726867164.98991: variable 'ansible_shell_executable' from source: unknown 12033 1726867164.98993: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867164.98995: variable 'ansible_pipelining' from source: unknown 12033 1726867164.98997: variable 'ansible_timeout' from source: unknown 12033 1726867164.98999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867164.99135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867164.99185: variable 'omit' from source: magic vars 12033 1726867164.99188: starting attempt loop 12033 1726867164.99190: running the handler 12033 1726867164.99206: handler run complete 12033 1726867164.99223: attempt loop complete, returning result 12033 1726867164.99233: _execute() done 12033 1726867164.99238: dumping result to json 12033 1726867164.99244: done dumping result, returning 12033 1726867164.99253: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-74bb-502b-0000000000b2] 12033 1726867164.99280: sending task result for task 0affcac9-a3a5-74bb-502b-0000000000b2 12033 1726867164.99494: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000000b2 12033 1726867164.99497: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 12033 1726867164.99529: no more pending results, returning what we have 12033 1726867164.99532: results queue empty 12033 1726867164.99533: checking for any_errors_fatal 12033 1726867164.99536: done checking for any_errors_fatal 12033 1726867164.99537: checking for max_fail_percentage 12033 1726867164.99539: done checking for max_fail_percentage 12033 1726867164.99539: checking to see if all hosts have failed and the running result is not ok 12033 1726867164.99540: done checking to see if all hosts have failed 12033 1726867164.99541: getting the remaining hosts for this loop 12033 1726867164.99542: done getting the remaining hosts for this loop 12033 1726867164.99545: getting the next task for host managed_node3 12033 1726867164.99551: done getting next task for host managed_node3 12033 1726867164.99553: ^ task is: TASK: Setup 12033 1726867164.99556: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867164.99559: getting variables 12033 1726867164.99560: in VariableManager get_vars() 12033 1726867164.99583: Calling all_inventory to load vars for managed_node3 12033 1726867164.99586: Calling groups_inventory to load vars for managed_node3 12033 1726867164.99589: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867164.99596: Calling all_plugins_play to load vars for managed_node3 12033 1726867164.99598: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867164.99601: Calling groups_plugins_play to load vars for managed_node3 12033 1726867164.99847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867165.00031: done with get_vars() 12033 1726867165.00041: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:19:25 -0400 (0:00:00.028) 0:00:04.117 ****** 12033 1726867165.00125: entering _queue_task() for managed_node3/include_tasks 12033 1726867165.00329: worker is 1 (out of 1 available) 12033 1726867165.00341: exiting _queue_task() for managed_node3/include_tasks 12033 1726867165.00353: done queuing things up, now waiting for results queue to drain 12033 1726867165.00355: waiting for pending results... 
12033 1726867165.00584: running TaskExecutor() for managed_node3/TASK: Setup 12033 1726867165.00673: in run() - task 0affcac9-a3a5-74bb-502b-00000000008b 12033 1726867165.00699: variable 'ansible_search_path' from source: unknown 12033 1726867165.00710: variable 'ansible_search_path' from source: unknown 12033 1726867165.00755: variable 'lsr_setup' from source: include params 12033 1726867165.00945: variable 'lsr_setup' from source: include params 12033 1726867165.01032: variable 'omit' from source: magic vars 12033 1726867165.01121: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.01141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867165.01156: variable 'omit' from source: magic vars 12033 1726867165.01582: variable 'ansible_distribution_major_version' from source: facts 12033 1726867165.01585: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867165.01588: variable 'item' from source: unknown 12033 1726867165.01590: variable 'item' from source: unknown 12033 1726867165.01592: variable 'item' from source: unknown 12033 1726867165.01594: variable 'item' from source: unknown 12033 1726867165.01803: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.01806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867165.01809: variable 'omit' from source: magic vars 12033 1726867165.01915: variable 'ansible_distribution_major_version' from source: facts 12033 1726867165.01927: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867165.01937: variable 'item' from source: unknown 12033 1726867165.02000: variable 'item' from source: unknown 12033 1726867165.02037: variable 'item' from source: unknown 12033 1726867165.02099: variable 'item' from source: unknown 12033 1726867165.02383: dumping result to json 12033 1726867165.02386: done dumping result, returning 12033 
1726867165.02389: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-74bb-502b-00000000008b] 12033 1726867165.02391: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008b 12033 1726867165.02428: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008b 12033 1726867165.02432: WORKER PROCESS EXITING 12033 1726867165.02460: no more pending results, returning what we have 12033 1726867165.02464: in VariableManager get_vars() 12033 1726867165.02492: Calling all_inventory to load vars for managed_node3 12033 1726867165.02495: Calling groups_inventory to load vars for managed_node3 12033 1726867165.02498: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867165.02507: Calling all_plugins_play to load vars for managed_node3 12033 1726867165.02510: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867165.02513: Calling groups_plugins_play to load vars for managed_node3 12033 1726867165.02674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867165.02871: done with get_vars() 12033 1726867165.02880: variable 'ansible_search_path' from source: unknown 12033 1726867165.02882: variable 'ansible_search_path' from source: unknown 12033 1726867165.02919: variable 'ansible_search_path' from source: unknown 12033 1726867165.02920: variable 'ansible_search_path' from source: unknown 12033 1726867165.02949: we have included files to process 12033 1726867165.02950: generating all_blocks data 12033 1726867165.02951: done generating all_blocks data 12033 1726867165.02955: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867165.02956: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867165.02958: 
Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867165.03938: done processing included file 12033 1726867165.03940: iterating over new_blocks loaded from include file 12033 1726867165.03941: in VariableManager get_vars() 12033 1726867165.03952: done with get_vars() 12033 1726867165.03954: filtering new block on tags 12033 1726867165.03999: done filtering new block on tags 12033 1726867165.04002: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 12033 1726867165.04007: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867165.04008: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867165.04011: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867165.04127: in VariableManager get_vars() 12033 1726867165.04144: done with get_vars() 12033 1726867165.04150: variable 'item' from source: include params 12033 1726867165.04249: variable 'item' from source: include params 12033 1726867165.04288: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12033 1726867165.04395: in VariableManager get_vars() 12033 1726867165.04413: done with get_vars() 12033 1726867165.04547: in VariableManager 
get_vars() 12033 1726867165.04563: done with get_vars() 12033 1726867165.04568: variable 'item' from source: include params 12033 1726867165.04629: variable 'item' from source: include params 12033 1726867165.04658: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12033 1726867165.04733: in VariableManager get_vars() 12033 1726867165.04749: done with get_vars() 12033 1726867165.04845: done processing included file 12033 1726867165.04847: iterating over new_blocks loaded from include file 12033 1726867165.04848: in VariableManager get_vars() 12033 1726867165.04860: done with get_vars() 12033 1726867165.04861: filtering new block on tags 12033 1726867165.04955: done filtering new block on tags 12033 1726867165.04958: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 12033 1726867165.04962: extending task lists for all hosts with included blocks 12033 1726867165.05536: done extending task lists 12033 1726867165.05537: done processing included files 12033 1726867165.05538: results queue empty 12033 1726867165.05539: checking for any_errors_fatal 12033 1726867165.05542: done checking for any_errors_fatal 12033 1726867165.05543: checking for max_fail_percentage 12033 1726867165.05544: done checking for max_fail_percentage 12033 1726867165.05545: checking to see if all hosts have failed and the running result is not ok 12033 1726867165.05545: done checking to see if all hosts have failed 12033 1726867165.05550: getting the remaining hosts for this loop 12033 1726867165.05551: done getting the remaining hosts for this loop 12033 
1726867165.05554: getting the next task for host managed_node3 12033 1726867165.05558: done getting next task for host managed_node3 12033 1726867165.05559: ^ task is: TASK: Install dnsmasq 12033 1726867165.05562: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867165.05564: getting variables 12033 1726867165.05565: in VariableManager get_vars() 12033 1726867165.05572: Calling all_inventory to load vars for managed_node3 12033 1726867165.05574: Calling groups_inventory to load vars for managed_node3 12033 1726867165.05578: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867165.05584: Calling all_plugins_play to load vars for managed_node3 12033 1726867165.05586: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867165.05588: Calling groups_plugins_play to load vars for managed_node3 12033 1726867165.05724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867165.05912: done with get_vars() 12033 1726867165.05920: done getting variables 12033 1726867165.05955: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:19:25 -0400 (0:00:00.058) 0:00:04.175 ****** 12033 1726867165.05982: entering _queue_task() for managed_node3/package 12033 1726867165.06200: worker is 1 (out of 1 available) 12033 1726867165.06211: exiting _queue_task() for managed_node3/package 12033 1726867165.06221: done queuing things up, now waiting for results queue to drain 12033 1726867165.06223: waiting for pending results... 
12033 1726867165.06451: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 12033 1726867165.06557: in run() - task 0affcac9-a3a5-74bb-502b-000000000112 12033 1726867165.06581: variable 'ansible_search_path' from source: unknown 12033 1726867165.06675: variable 'ansible_search_path' from source: unknown 12033 1726867165.06682: calling self._execute() 12033 1726867165.06738: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.06750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867165.06763: variable 'omit' from source: magic vars 12033 1726867165.07095: variable 'ansible_distribution_major_version' from source: facts 12033 1726867165.07111: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867165.07129: variable 'omit' from source: magic vars 12033 1726867165.07171: variable 'omit' from source: magic vars 12033 1726867165.07336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867165.09484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867165.09488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867165.09491: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867165.09514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867165.09545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867165.09646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867165.09682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867165.09719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867165.09765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867165.09788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867165.09892: variable '__network_is_ostree' from source: set_fact 12033 1726867165.09902: variable 'omit' from source: magic vars 12033 1726867165.09933: variable 'omit' from source: magic vars 12033 1726867165.09959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867165.09991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867165.10012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867165.10036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867165.10049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867165.10080: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867165.10088: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.10094: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867165.10192: Set connection var ansible_pipelining to False 12033 1726867165.10205: Set connection var ansible_shell_executable to /bin/sh 12033 1726867165.10217: Set connection var ansible_timeout to 10 12033 1726867165.10225: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867165.10230: Set connection var ansible_connection to ssh 12033 1726867165.10241: Set connection var ansible_shell_type to sh 12033 1726867165.10262: variable 'ansible_shell_executable' from source: unknown 12033 1726867165.10269: variable 'ansible_connection' from source: unknown 12033 1726867165.10274: variable 'ansible_module_compression' from source: unknown 12033 1726867165.10281: variable 'ansible_shell_type' from source: unknown 12033 1726867165.10286: variable 'ansible_shell_executable' from source: unknown 12033 1726867165.10292: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.10298: variable 'ansible_pipelining' from source: unknown 12033 1726867165.10303: variable 'ansible_timeout' from source: unknown 12033 1726867165.10309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867165.10400: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867165.10417: variable 'omit' from source: magic vars 12033 1726867165.10426: starting attempt loop 12033 1726867165.10457: running the handler 12033 1726867165.10460: variable 'ansible_facts' from source: unknown 12033 1726867165.10462: variable 'ansible_facts' from source: unknown 12033 1726867165.10495: _low_level_execute_command(): starting 12033 1726867165.10505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867165.11198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867165.11295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867165.11327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867165.11352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867165.11369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.11457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.13144: stdout chunk (state=3): >>>/root <<< 12033 1726867165.13383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867165.13387: stdout chunk (state=3): >>><<< 12033 1726867165.13391: stderr chunk (state=3): >>><<< 12033 1726867165.13394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867165.13404: _low_level_execute_command(): starting 12033 1726867165.13407: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064 `" && echo ansible-tmp-1726867165.1331506-12224-191050071391064="` echo /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064 `" ) && sleep 0' 12033 1726867165.13973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867165.13991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867165.14004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867165.14021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867165.14036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 
12033 1726867165.14046: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867165.14082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867165.14101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867165.14186: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867165.14219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.14300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.16175: stdout chunk (state=3): >>>ansible-tmp-1726867165.1331506-12224-191050071391064=/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064 <<< 12033 1726867165.16320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867165.16346: stdout chunk (state=3): >>><<< 12033 1726867165.16350: stderr chunk (state=3): >>><<< 12033 1726867165.16369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867165.1331506-12224-191050071391064=/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867165.16583: variable 'ansible_module_compression' from source: unknown 12033 1726867165.16591: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 12033 1726867165.16594: ANSIBALLZ: Acquiring lock 12033 1726867165.16596: ANSIBALLZ: Lock acquired: 139897899327968 12033 1726867165.16598: ANSIBALLZ: Creating module 12033 1726867165.34124: ANSIBALLZ: Writing module into payload 12033 1726867165.34334: ANSIBALLZ: Writing module 12033 1726867165.34361: ANSIBALLZ: Renaming module 12033 1726867165.34380: ANSIBALLZ: Done creating module 12033 1726867165.34413: variable 'ansible_facts' from source: unknown 12033 1726867165.34520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py 12033 1726867165.34707: Sending initial data 12033 1726867165.34710: Sent initial data (152 bytes) 12033 1726867165.35339: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867165.35353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 
1726867165.35399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867165.35414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867165.35506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867165.35529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.35613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.37245: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867165.37373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867165.37427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp75xq244z /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py <<< 12033 1726867165.37430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py" <<< 12033 1726867165.37465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp75xq244z" to remote "/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py" <<< 12033 1726867165.38808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867165.38858: stderr chunk (state=3): >>><<< 12033 1726867165.38870: stdout chunk (state=3): >>><<< 12033 1726867165.38915: done transferring module to remote 12033 1726867165.39002: _low_level_execute_command(): starting 12033 1726867165.39006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/ /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py && sleep 0' 12033 1726867165.39568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867165.39666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867165.39682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867165.39710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867165.39731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.39808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.41599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867165.41627: stdout chunk (state=3): >>><<< 12033 1726867165.41983: stderr chunk (state=3): >>><<< 12033 1726867165.41987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867165.41992: _low_level_execute_command(): starting 12033 1726867165.41995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/AnsiballZ_dnf.py && sleep 0' 12033 1726867165.42964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867165.42976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867165.42994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867165.43038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 
1726867165.43049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867165.43295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.43368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.84316: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12033 1726867165.88333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867165.88337: stdout chunk (state=3): >>><<< 12033 1726867165.88345: stderr chunk (state=3): >>><<< 12033 1726867165.88364: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867165.88411: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867165.88417: _low_level_execute_command(): starting 12033 1726867165.88424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867165.1331506-12224-191050071391064/ > /dev/null 2>&1 && sleep 0' 12033 1726867165.89010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867165.89025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867165.89040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867165.89058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867165.89079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867165.89096: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867165.89192: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867165.89218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867165.89293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867165.91297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867165.91396: stderr chunk (state=3): >>><<< 12033 1726867165.91399: stdout chunk (state=3): >>><<< 12033 1726867165.91401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867165.91404: handler run complete 12033 1726867165.91529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867165.91722: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867165.91765: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867165.91804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867165.91843: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867165.91919: variable '__install_status' from source: unknown 12033 1726867165.91951: Evaluated conditional (__install_status is success): True 12033 1726867165.91975: attempt loop complete, returning result 12033 1726867165.91987: _execute() done 12033 1726867165.92050: dumping result to json 12033 1726867165.92053: done dumping result, returning 12033 1726867165.92055: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0affcac9-a3a5-74bb-502b-000000000112] 12033 1726867165.92058: sending task result for task 0affcac9-a3a5-74bb-502b-000000000112 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12033 1726867165.92397: no more pending results, returning what we have 12033 1726867165.92401: results queue empty 12033 1726867165.92403: checking for any_errors_fatal 12033 1726867165.92404: done checking for any_errors_fatal 12033 1726867165.92405: checking for max_fail_percentage 12033 
1726867165.92406: done checking for max_fail_percentage 12033 1726867165.92407: checking to see if all hosts have failed and the running result is not ok 12033 1726867165.92408: done checking to see if all hosts have failed 12033 1726867165.92408: getting the remaining hosts for this loop 12033 1726867165.92410: done getting the remaining hosts for this loop 12033 1726867165.92413: getting the next task for host managed_node3 12033 1726867165.92420: done getting next task for host managed_node3 12033 1726867165.92422: ^ task is: TASK: Install pgrep, sysctl 12033 1726867165.92425: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867165.92429: getting variables 12033 1726867165.92430: in VariableManager get_vars() 12033 1726867165.92459: Calling all_inventory to load vars for managed_node3 12033 1726867165.92462: Calling groups_inventory to load vars for managed_node3 12033 1726867165.92465: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867165.92599: Calling all_plugins_play to load vars for managed_node3 12033 1726867165.92602: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867165.92605: Calling groups_plugins_play to load vars for managed_node3 12033 1726867165.92918: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000112 12033 1726867165.92921: WORKER PROCESS EXITING 12033 1726867165.92943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867165.93165: done with get_vars() 12033 1726867165.93176: done getting variables 12033 1726867165.93231: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 17:19:25 -0400 (0:00:00.872) 0:00:05.048 ****** 12033 1726867165.93265: entering _queue_task() for managed_node3/package 12033 1726867165.93507: worker is 1 (out of 1 available) 12033 1726867165.93518: exiting _queue_task() for managed_node3/package 12033 1726867165.93529: done queuing things up, now waiting for results queue to drain 12033 1726867165.93531: waiting for pending results... 
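[Editor's note] The task traced above returned `ok` with `"msg": "Nothing to do"` because the package was already present. From the logged `module_args` (`name: ["dnsmasq"]`, `state: "present"`, dispatched via the `ansible.legacy.dnf` backend through the `package` action plugin), the originating play task would look roughly like the sketch below; the actual task file in the `fedora.linux_system_roles` test playbook may differ in form:

```yaml
# Sketch reconstructed from the logged module_args; not the literal source task.
- name: Install dnsmasq
  ansible.builtin.package:
    name: dnsmasq
    state: present
```

On this RedHat-family host the `package` action resolved to the `dnf` module, which is why the payload transferred over SFTP is named `AnsiballZ_dnf.py`.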
12033 1726867165.93769: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl
12033 1726867165.93879: in run() - task 0affcac9-a3a5-74bb-502b-000000000113
12033 1726867165.93905: variable 'ansible_search_path' from source: unknown
12033 1726867165.93913: variable 'ansible_search_path' from source: unknown
12033 1726867165.93950: calling self._execute()
12033 1726867165.94033: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867165.94045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867165.94058: variable 'omit' from source: magic vars
12033 1726867165.94407: variable 'ansible_distribution_major_version' from source: facts
12033 1726867165.94422: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867165.94528: variable 'ansible_os_family' from source: facts
12033 1726867165.94549: Evaluated conditional (ansible_os_family == 'RedHat'): True
12033 1726867165.94728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867165.95051: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867165.95105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867165.95143: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867165.95180: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867165.95264: variable 'ansible_distribution_major_version' from source: facts
12033 1726867165.95284: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False
12033 1726867165.95292: when evaluation is False, skipping this task
12033 1726867165.95382: _execute() done
12033 1726867165.95386: dumping result to json
12033 1726867165.95388: done dumping result, returning
12033 1726867165.95391: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcac9-a3a5-74bb-502b-000000000113]
12033 1726867165.95393: sending task result for task 0affcac9-a3a5-74bb-502b-000000000113
12033 1726867165.95521: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000113
12033 1726867165.95524: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version is version('6', '<=')",
    "skip_reason": "Conditional result was False"
}
12033 1726867165.95571: no more pending results, returning what we have
12033 1726867165.95575: results queue empty
12033 1726867165.95576: checking for any_errors_fatal
12033 1726867165.95585: done checking for any_errors_fatal
12033 1726867165.95585: checking for max_fail_percentage
12033 1726867165.95587: done checking for max_fail_percentage
12033 1726867165.95588: checking to see if all hosts have failed and the running result is not ok
12033 1726867165.95589: done checking to see if all hosts have failed
12033 1726867165.95590: getting the remaining hosts for this loop
12033 1726867165.95592: done getting the remaining hosts for this loop
12033 1726867165.95595: getting the next task for host managed_node3
12033 1726867165.95603: done getting next task for host managed_node3
12033 1726867165.95605: ^ task is: TASK: Install pgrep, sysctl
12033 1726867165.95609: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867165.95613: getting variables
12033 1726867165.95615: in VariableManager get_vars()
12033 1726867165.95751: Calling all_inventory to load vars for managed_node3
12033 1726867165.95754: Calling groups_inventory to load vars for managed_node3
12033 1726867165.95757: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867165.95765: Calling all_plugins_play to load vars for managed_node3
12033 1726867165.95768: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867165.95771: Calling groups_plugins_play to load vars for managed_node3
12033 1726867165.95949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867165.96171: done with get_vars()
12033 1726867165.96183: done getting variables
12033 1726867165.96240: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install pgrep, sysctl] ***************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Friday 20 September 2024  17:19:25 -0400 (0:00:00.030)       0:00:05.078 ******
12033 1726867165.96268: entering _queue_task() for managed_node3/package
12033 1726867165.96540: worker is 1 (out of 1 available)
12033 1726867165.96551: exiting _queue_task() for managed_node3/package
12033 1726867165.96560: done queuing
things up, now waiting for results queue to drain 12033 1726867165.96562: waiting for pending results... 12033 1726867165.96764: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12033 1726867165.96852: in run() - task 0affcac9-a3a5-74bb-502b-000000000114 12033 1726867165.96887: variable 'ansible_search_path' from source: unknown 12033 1726867165.96890: variable 'ansible_search_path' from source: unknown 12033 1726867165.96925: calling self._execute() 12033 1726867165.97010: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867165.97075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867165.97081: variable 'omit' from source: magic vars 12033 1726867165.97359: variable 'ansible_distribution_major_version' from source: facts 12033 1726867165.97373: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867165.97491: variable 'ansible_os_family' from source: facts 12033 1726867165.97502: Evaluated conditional (ansible_os_family == 'RedHat'): True 12033 1726867165.97666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867165.97936: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867165.97993: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867165.98031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867165.98076: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867165.98163: variable 'ansible_distribution_major_version' from source: facts 12033 1726867165.98183: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12033 1726867165.98186: variable 'omit' from source: magic vars 12033 1726867165.98271: variable 
'omit' from source: magic vars 12033 1726867165.98392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867166.00651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867166.00715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867166.00754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867166.00799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867166.00831: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867166.00980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867166.00984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867166.00995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867166.01042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867166.01063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867166.01159: variable 
'__network_is_ostree' from source: set_fact 12033 1726867166.01170: variable 'omit' from source: magic vars 12033 1726867166.01205: variable 'omit' from source: magic vars 12033 1726867166.01237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867166.01300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867166.01304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867166.01313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867166.01325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867166.01357: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867166.01363: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867166.01369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867166.01462: Set connection var ansible_pipelining to False 12033 1726867166.01482: Set connection var ansible_shell_executable to /bin/sh 12033 1726867166.01517: Set connection var ansible_timeout to 10 12033 1726867166.01520: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867166.01523: Set connection var ansible_connection to ssh 12033 1726867166.01525: Set connection var ansible_shell_type to sh 12033 1726867166.01560: variable 'ansible_shell_executable' from source: unknown 12033 1726867166.01570: variable 'ansible_connection' from source: unknown 12033 1726867166.01579: variable 'ansible_module_compression' from source: unknown 12033 1726867166.01627: variable 'ansible_shell_type' from source: unknown 12033 1726867166.01630: variable 'ansible_shell_executable' from source: unknown 12033 
1726867166.01633: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867166.01635: variable 'ansible_pipelining' from source: unknown 12033 1726867166.01638: variable 'ansible_timeout' from source: unknown 12033 1726867166.01640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867166.01720: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867166.01739: variable 'omit' from source: magic vars 12033 1726867166.01748: starting attempt loop 12033 1726867166.01754: running the handler 12033 1726867166.01783: variable 'ansible_facts' from source: unknown 12033 1726867166.01786: variable 'ansible_facts' from source: unknown 12033 1726867166.01824: _low_level_execute_command(): starting 12033 1726867166.01843: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867166.02533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.02556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.02572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.02591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.02608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867166.02664: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.02724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.02743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.02770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.02856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.04519: stdout chunk (state=3): >>>/root <<< 12033 1726867166.04667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.04670: stdout chunk (state=3): >>><<< 12033 1726867166.04673: stderr chunk (state=3): >>><<< 12033 1726867166.04704: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.04722: _low_level_execute_command(): starting 12033 1726867166.04799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942 `" && echo ansible-tmp-1726867166.0471032-12261-110598135686942="` echo /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942 `" ) && sleep 0' 12033 1726867166.05350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.05364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.05390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.05447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.05510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.05533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.05564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.05632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.07533: stdout chunk (state=3): >>>ansible-tmp-1726867166.0471032-12261-110598135686942=/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942 <<< 12033 1726867166.07703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.07706: stdout chunk (state=3): >>><<< 12033 1726867166.07708: stderr chunk (state=3): >>><<< 12033 1726867166.07729: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867166.0471032-12261-110598135686942=/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.07882: variable 'ansible_module_compression' from source: unknown 12033 1726867166.07885: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12033 1726867166.07890: variable 'ansible_facts' from source: unknown 12033 1726867166.08013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py 12033 1726867166.08243: Sending initial data 12033 1726867166.08246: Sent initial data (152 bytes) 12033 1726867166.08729: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.08742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.08748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.08844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.08861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 
1726867166.08934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.10490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867166.10548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867166.10598: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp0x13bhvi /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py <<< 12033 1726867166.10604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py" <<< 12033 1726867166.10658: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp0x13bhvi" to remote "/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py" <<< 12033 1726867166.11676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.11790: stderr chunk (state=3): >>><<< 12033 1726867166.11793: stdout chunk (state=3): >>><<< 12033 1726867166.11795: done transferring module to remote 12033 1726867166.11797: _low_level_execute_command(): starting 12033 1726867166.11800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/ /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py && sleep 0' 12033 1726867166.12336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.12349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.12365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.12386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.12440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.12504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.12558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.12608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.14451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.14455: stdout chunk (state=3): >>><<< 12033 1726867166.14457: stderr chunk (state=3): >>><<< 12033 1726867166.14460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.14462: _low_level_execute_command(): starting 12033 1726867166.14464: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/AnsiballZ_dnf.py && sleep 0' 12033 1726867166.14957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.14972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.14989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.15007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.15026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867166.15095: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.15129: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.15143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.15252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.15309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.56309: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12033 1726867166.70605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867166.70610: stdout chunk (state=3): >>><<< 12033 1726867166.70613: stderr chunk (state=3): >>><<< 12033 1726867166.70615: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867166.70623: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867166.70626: _low_level_execute_command(): starting 12033 1726867166.70628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867166.0471032-12261-110598135686942/ > /dev/null 2>&1 && sleep 0' 12033 1726867166.71518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.71532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
12033 1726867166.71543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.71618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.71675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.73699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.73733: stderr chunk (state=3): >>><<< 12033 1726867166.73736: stdout chunk (state=3): >>><<< 12033 1726867166.73869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.73878: handler run complete 12033 1726867166.73945: attempt loop complete, returning result 12033 1726867166.73948: _execute() done 12033 1726867166.73950: dumping result to json 12033 1726867166.73952: done dumping result, returning 12033 1726867166.73954: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcac9-a3a5-74bb-502b-000000000114] 12033 1726867166.73956: sending task result for task 0affcac9-a3a5-74bb-502b-000000000114 12033 1726867166.74162: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000114 12033 1726867166.74165: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12033 1726867166.74241: no more pending results, returning what we have 12033 1726867166.74244: results queue empty 12033 1726867166.74245: checking for any_errors_fatal 12033 1726867166.74250: done checking for any_errors_fatal 12033 1726867166.74250: checking for max_fail_percentage 12033 1726867166.74252: done checking for max_fail_percentage 12033 1726867166.74273: checking to see if all hosts have failed and the running result is not ok 12033 1726867166.74275: done checking to see if all hosts have failed 12033 1726867166.74276: getting the remaining hosts for this loop 12033 1726867166.74279: done getting the remaining hosts for this loop 12033 1726867166.74282: getting the next task for host managed_node3 12033 1726867166.74288: done getting next task for host managed_node3 12033 1726867166.74290: ^ task is: TASK: Create test interfaces 12033 1726867166.74297: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867166.74371: getting variables 12033 1726867166.74373: in VariableManager get_vars() 12033 1726867166.74401: Calling all_inventory to load vars for managed_node3 12033 1726867166.74405: Calling groups_inventory to load vars for managed_node3 12033 1726867166.74408: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867166.74417: Calling all_plugins_play to load vars for managed_node3 12033 1726867166.74420: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867166.74423: Calling groups_plugins_play to load vars for managed_node3 12033 1726867166.74622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867166.74835: done with get_vars() 12033 1726867166.74845: done getting variables 12033 1726867166.74945: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 17:19:26 -0400 (0:00:00.787) 0:00:05.865 ****** 12033 1726867166.74975: entering _queue_task() for managed_node3/shell 12033 1726867166.74984: Creating lock for shell 12033 1726867166.75383: worker is 1 (out of 1 available) 12033 1726867166.75393: exiting _queue_task() for managed_node3/shell 12033 1726867166.75403: done queuing things up, now waiting for results queue to drain 12033 1726867166.75404: waiting for pending results... 12033 1726867166.75641: running TaskExecutor() for managed_node3/TASK: Create test interfaces 12033 1726867166.75690: in run() - task 0affcac9-a3a5-74bb-502b-000000000115 12033 1726867166.75711: variable 'ansible_search_path' from source: unknown 12033 1726867166.75719: variable 'ansible_search_path' from source: unknown 12033 1726867166.75787: calling self._execute() 12033 1726867166.75849: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867166.75860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867166.75883: variable 'omit' from source: magic vars 12033 1726867166.76330: variable 'ansible_distribution_major_version' from source: facts 12033 1726867166.76334: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867166.76336: variable 'omit' from source: magic vars 12033 1726867166.76338: variable 'omit' from source: magic vars 12033 1726867166.76814: variable 'dhcp_interface1' from source: play vars 12033 1726867166.76825: variable 'dhcp_interface2' from source: play vars 12033 1726867166.76852: variable 'omit' from source: magic vars 12033 1726867166.76904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867166.76943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 12033 1726867166.76968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867166.77082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867166.77088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867166.77090: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867166.77092: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867166.77096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867166.77169: Set connection var ansible_pipelining to False 12033 1726867166.77187: Set connection var ansible_shell_executable to /bin/sh 12033 1726867166.77200: Set connection var ansible_timeout to 10 12033 1726867166.77219: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867166.77322: Set connection var ansible_connection to ssh 12033 1726867166.77325: Set connection var ansible_shell_type to sh 12033 1726867166.77327: variable 'ansible_shell_executable' from source: unknown 12033 1726867166.77329: variable 'ansible_connection' from source: unknown 12033 1726867166.77331: variable 'ansible_module_compression' from source: unknown 12033 1726867166.77333: variable 'ansible_shell_type' from source: unknown 12033 1726867166.77335: variable 'ansible_shell_executable' from source: unknown 12033 1726867166.77337: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867166.77339: variable 'ansible_pipelining' from source: unknown 12033 1726867166.77340: variable 'ansible_timeout' from source: unknown 12033 1726867166.77343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867166.77457: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867166.77474: variable 'omit' from source: magic vars 12033 1726867166.77486: starting attempt loop 12033 1726867166.77493: running the handler 12033 1726867166.77505: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867166.77539: _low_level_execute_command(): starting 12033 1726867166.77545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867166.78303: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.78315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.78393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867166.78443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.78489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.78541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.80219: stdout chunk (state=3): >>>/root <<< 12033 1726867166.80354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.80367: stdout chunk (state=3): >>><<< 12033 1726867166.80386: stderr chunk (state=3): >>><<< 12033 1726867166.80414: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.80503: _low_level_execute_command(): starting 12033 1726867166.80508: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441 `" && echo ansible-tmp-1726867166.8042073-12294-199438269269441="` echo /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441 `" ) && sleep 0' 12033 1726867166.81046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.81094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867166.81116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.81208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.81246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.81290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.83225: stdout chunk (state=3): >>>ansible-tmp-1726867166.8042073-12294-199438269269441=/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441 <<< 12033 
1726867166.83383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.83386: stdout chunk (state=3): >>><<< 12033 1726867166.83388: stderr chunk (state=3): >>><<< 12033 1726867166.83403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867166.8042073-12294-199438269269441=/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.83588: variable 'ansible_module_compression' from source: unknown 12033 1726867166.83592: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867166.83594: variable 'ansible_facts' from source: unknown 12033 1726867166.83615: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py 12033 1726867166.83797: Sending initial data 12033 1726867166.83812: Sent initial data (156 bytes) 12033 1726867166.84456: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867166.84513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.84546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.84549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.84551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.84617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.86196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12033 1726867166.86206: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867166.86242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867166.86283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpmndgzfvg /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py <<< 12033 1726867166.86295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py" <<< 12033 1726867166.86327: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpmndgzfvg" to remote "/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py" <<< 12033 1726867166.86335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py" <<< 12033 1726867166.87125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.87164: stderr chunk (state=3): >>><<< 12033 1726867166.87237: stdout chunk (state=3): >>><<< 12033 1726867166.87246: done transferring module to remote 12033 1726867166.87269: _low_level_execute_command(): starting 12033 1726867166.87285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/ 
/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py && sleep 0' 12033 1726867166.87869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867166.87901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.87912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.87918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867166.87947: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867166.87960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.88043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867166.88057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.88118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867166.89901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867166.89921: stderr chunk (state=3): >>><<< 12033 1726867166.89924: stdout chunk (state=3): >>><<< 12033 1726867166.89939: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867166.89942: _low_level_execute_command(): starting 12033 1726867166.89947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/AnsiballZ_command.py && sleep 0' 12033 1726867166.90352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867166.90356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.90359: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867166.90362: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867166.90364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867166.90410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867166.90416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867166.90474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.27907: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ 
systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:27.054351", "end": "2024-09-20 17:19:28.276138", "delta": "0:00:01.221787", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867168.29494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867168.29524: stderr chunk (state=3): >>><<< 12033 1726867168.29527: stdout chunk (state=3): >>><<< 12033 1726867168.29548: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:27.054351", "end": "2024-09-20 17:19:28.276138", "delta": "0:00:01.221787", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867168.29589: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867168.29599: _low_level_execute_command(): starting 12033 1726867168.29603: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867166.8042073-12294-199438269269441/ > /dev/null 2>&1 && sleep 0' 12033 1726867168.30065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.30068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867168.30070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867168.30072: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867168.30075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.30122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.30129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.30131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.30176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.32011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.32031: stderr chunk (state=3): >>><<< 12033 1726867168.32035: stdout chunk (state=3): >>><<< 12033 1726867168.32046: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.32052: handler run complete 12033 1726867168.32069: Evaluated conditional (False): False 12033 1726867168.32079: attempt loop complete, returning result 12033 1726867168.32082: _execute() done 12033 1726867168.32085: dumping result to json 12033 1726867168.32093: done dumping result, returning 12033 1726867168.32099: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0affcac9-a3a5-74bb-502b-000000000115] 12033 1726867168.32108: sending task result for task 0affcac9-a3a5-74bb-502b-000000000115 12033 1726867168.32212: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000115 12033 1726867168.32214: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.221787", "end": "2024-09-20 17:19:28.276138", "rc": 0, "start": "2024-09-20 17:19:27.054351" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12033 1726867168.32295: no more pending results, returning what we have 12033 1726867168.32300: results queue empty 12033 1726867168.32301: checking for any_errors_fatal 12033 1726867168.32308: done checking for any_errors_fatal 12033 1726867168.32308: checking for max_fail_percentage 12033 1726867168.32310: done checking for max_fail_percentage 12033 1726867168.32311: checking to see if all hosts have failed and 
the running result is not ok 12033 1726867168.32312: done checking to see if all hosts have failed 12033 1726867168.32312: getting the remaining hosts for this loop 12033 1726867168.32314: done getting the remaining hosts for this loop 12033 1726867168.32318: getting the next task for host managed_node3 12033 1726867168.32328: done getting next task for host managed_node3 12033 1726867168.32331: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12033 1726867168.32335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867168.32338: getting variables 12033 1726867168.32339: in VariableManager get_vars() 12033 1726867168.32363: Calling all_inventory to load vars for managed_node3 12033 1726867168.32366: Calling groups_inventory to load vars for managed_node3 12033 1726867168.32368: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.32384: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.32387: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.32392: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.32551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.32668: done with get_vars() 12033 1726867168.32675: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:28 -0400 (0:00:01.577) 0:00:07.443 ****** 12033 1726867168.32747: entering _queue_task() for managed_node3/include_tasks 12033 1726867168.32949: worker is 1 (out of 1 available) 12033 1726867168.32962: exiting _queue_task() for managed_node3/include_tasks 12033 1726867168.32971: done queuing things up, now waiting for results queue to drain 12033 1726867168.32973: waiting for pending results... 
12033 1726867168.33137: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12033 1726867168.33212: in run() - task 0affcac9-a3a5-74bb-502b-00000000011c 12033 1726867168.33217: variable 'ansible_search_path' from source: unknown 12033 1726867168.33220: variable 'ansible_search_path' from source: unknown 12033 1726867168.33249: calling self._execute() 12033 1726867168.33484: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.33487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.33493: variable 'omit' from source: magic vars 12033 1726867168.33731: variable 'ansible_distribution_major_version' from source: facts 12033 1726867168.33748: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867168.33759: _execute() done 12033 1726867168.33765: dumping result to json 12033 1726867168.33771: done dumping result, returning 12033 1726867168.33784: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-74bb-502b-00000000011c] 12033 1726867168.33798: sending task result for task 0affcac9-a3a5-74bb-502b-00000000011c 12033 1726867168.33898: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000011c 12033 1726867168.33906: WORKER PROCESS EXITING 12033 1726867168.33934: no more pending results, returning what we have 12033 1726867168.33938: in VariableManager get_vars() 12033 1726867168.33970: Calling all_inventory to load vars for managed_node3 12033 1726867168.33972: Calling groups_inventory to load vars for managed_node3 12033 1726867168.33975: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.33992: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.33995: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.33998: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867168.34362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.34531: done with get_vars() 12033 1726867168.34537: variable 'ansible_search_path' from source: unknown 12033 1726867168.34537: variable 'ansible_search_path' from source: unknown 12033 1726867168.34571: we have included files to process 12033 1726867168.34572: generating all_blocks data 12033 1726867168.34574: done generating all_blocks data 12033 1726867168.34580: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.34581: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.34582: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.34739: done processing included file 12033 1726867168.34740: iterating over new_blocks loaded from include file 12033 1726867168.34741: in VariableManager get_vars() 12033 1726867168.34751: done with get_vars() 12033 1726867168.34752: filtering new block on tags 12033 1726867168.34769: done filtering new block on tags 12033 1726867168.34771: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12033 1726867168.34775: extending task lists for all hosts with included blocks 12033 1726867168.34926: done extending task lists 12033 1726867168.34927: done processing included files 12033 1726867168.34928: results queue empty 12033 1726867168.34928: checking for any_errors_fatal 12033 1726867168.34931: done checking for any_errors_fatal 12033 1726867168.34932: checking for max_fail_percentage 12033 1726867168.34932: done checking for 
max_fail_percentage 12033 1726867168.34933: checking to see if all hosts have failed and the running result is not ok 12033 1726867168.34933: done checking to see if all hosts have failed 12033 1726867168.34934: getting the remaining hosts for this loop 12033 1726867168.34935: done getting the remaining hosts for this loop 12033 1726867168.34936: getting the next task for host managed_node3 12033 1726867168.34939: done getting next task for host managed_node3 12033 1726867168.34940: ^ task is: TASK: Get stat for interface {{ interface }} 12033 1726867168.34943: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867168.34944: getting variables 12033 1726867168.34945: in VariableManager get_vars() 12033 1726867168.34950: Calling all_inventory to load vars for managed_node3 12033 1726867168.34951: Calling groups_inventory to load vars for managed_node3 12033 1726867168.34953: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.34956: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.34957: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.34959: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.35039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.35146: done with get_vars() 12033 1726867168.35152: done getting variables 12033 1726867168.35273: variable 'interface' from source: task vars 12033 1726867168.35280: variable 'dhcp_interface1' from source: play vars 12033 1726867168.35342: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:28 -0400 (0:00:00.026) 0:00:07.469 ****** 12033 1726867168.35374: entering _queue_task() for managed_node3/stat 12033 1726867168.35624: worker is 1 (out of 1 available) 12033 1726867168.35637: exiting _queue_task() for managed_node3/stat 12033 1726867168.35649: done queuing things up, now waiting for results queue to drain 12033 1726867168.35651: waiting for pending results... 
12033 1726867168.35899: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 12033 1726867168.36004: in run() - task 0affcac9-a3a5-74bb-502b-00000000017b 12033 1726867168.36031: variable 'ansible_search_path' from source: unknown 12033 1726867168.36040: variable 'ansible_search_path' from source: unknown 12033 1726867168.36080: calling self._execute() 12033 1726867168.36168: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.36182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.36213: variable 'omit' from source: magic vars 12033 1726867168.36672: variable 'ansible_distribution_major_version' from source: facts 12033 1726867168.36695: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867168.36975: variable 'omit' from source: magic vars 12033 1726867168.36981: variable 'omit' from source: magic vars 12033 1726867168.37039: variable 'interface' from source: task vars 12033 1726867168.37048: variable 'dhcp_interface1' from source: play vars 12033 1726867168.37111: variable 'dhcp_interface1' from source: play vars 12033 1726867168.37135: variable 'omit' from source: magic vars 12033 1726867168.37175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867168.37215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867168.37237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867168.37258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.37272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.37309: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867168.37318: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.37326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.37424: Set connection var ansible_pipelining to False 12033 1726867168.37582: Set connection var ansible_shell_executable to /bin/sh 12033 1726867168.37586: Set connection var ansible_timeout to 10 12033 1726867168.37588: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867168.37590: Set connection var ansible_connection to ssh 12033 1726867168.37592: Set connection var ansible_shell_type to sh 12033 1726867168.37594: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.37596: variable 'ansible_connection' from source: unknown 12033 1726867168.37598: variable 'ansible_module_compression' from source: unknown 12033 1726867168.37600: variable 'ansible_shell_type' from source: unknown 12033 1726867168.37602: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.37603: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.37605: variable 'ansible_pipelining' from source: unknown 12033 1726867168.37607: variable 'ansible_timeout' from source: unknown 12033 1726867168.37610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.37714: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867168.37730: variable 'omit' from source: magic vars 12033 1726867168.37741: starting attempt loop 12033 1726867168.37747: running the handler 12033 1726867168.37764: _low_level_execute_command(): starting 12033 1726867168.37775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867168.38423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.38439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.38452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.38469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.38567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.38588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.38605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.38701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.40493: stdout chunk (state=3): >>>/root <<< 12033 1726867168.40585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.40591: stdout chunk (state=3): >>><<< 12033 1726867168.40594: stderr chunk (state=3): >>><<< 12033 1726867168.40599: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.40613: _low_level_execute_command(): starting 12033 1726867168.40619: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022 `" && echo ansible-tmp-1726867168.4059763-12347-160882825731022="` echo /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022 `" ) && sleep 0' 12033 1726867168.41706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.41710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.41798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.41865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.41880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.41931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.42043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.42105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.44011: stdout chunk (state=3): >>>ansible-tmp-1726867168.4059763-12347-160882825731022=/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022 <<< 12033 1726867168.44172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.44175: stdout chunk (state=3): >>><<< 12033 1726867168.44179: stderr chunk (state=3): >>><<< 12033 1726867168.44197: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867168.4059763-12347-160882825731022=/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.44250: variable 'ansible_module_compression' from source: unknown 12033 1726867168.44321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867168.44364: variable 'ansible_facts' from source: unknown 12033 1726867168.44460: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py 12033 1726867168.44600: Sending initial data 12033 1726867168.44684: Sent initial data (153 bytes) 12033 1726867168.45246: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.45301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.45321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.45336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.45440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.46978: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867168.47036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867168.47098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6yvrurtb /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py <<< 12033 1726867168.47124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py" <<< 12033 1726867168.47166: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6yvrurtb" to remote "/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py" <<< 12033 1726867168.47892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.47922: stderr chunk (state=3): >>><<< 12033 1726867168.48032: stdout chunk (state=3): >>><<< 12033 1726867168.48037: done transferring module to remote 12033 1726867168.48039: _low_level_execute_command(): starting 12033 1726867168.48042: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/ /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py && sleep 0' 12033 1726867168.48844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.48847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.48864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.48962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.48967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867168.48973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.49172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.49225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.49264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.51002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.51047: stderr chunk (state=3): >>><<< 12033 1726867168.51060: stdout chunk (state=3): >>><<< 12033 1726867168.51080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.51090: _low_level_execute_command(): starting 12033 1726867168.51100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/AnsiballZ_stat.py && sleep 0' 12033 1726867168.51670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.51691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.51710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.51732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.51748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867168.51758: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867168.51795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.51869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.51892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.51926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.52040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.67276: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1726867167.061087, "mtime": 1726867167.061087, "ctime": 1726867167.061087, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867168.68535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867168.68557: stdout chunk (state=3): >>><<< 12033 1726867168.68570: stderr chunk (state=3): >>><<< 12033 1726867168.68785: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1726867167.061087, "mtime": 1726867167.061087, "ctime": 1726867167.061087, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867168.68788: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867168.68795: _low_level_execute_command(): starting 12033 1726867168.68797: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867168.4059763-12347-160882825731022/ > /dev/null 2>&1 && sleep 0' 12033 1726867168.69630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.69646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.69693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.69707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.69718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867168.69801: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.69817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.69838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.69914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.71815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.71825: stdout chunk (state=3): >>><<< 12033 1726867168.71837: stderr chunk (state=3): >>><<< 12033 1726867168.71857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.71869: handler run complete 12033 1726867168.71930: attempt loop complete, returning result 12033 1726867168.71983: _execute() done 12033 1726867168.71987: dumping result to json 12033 1726867168.71989: done dumping result, returning 12033 1726867168.71992: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0affcac9-a3a5-74bb-502b-00000000017b] 12033 1726867168.71993: sending task result for task 0affcac9-a3a5-74bb-502b-00000000017b ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867167.061087, "block_size": 4096, "blocks": 0, "ctime": 1726867167.061087, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27482, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726867167.061087, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12033 1726867168.72199: no more pending results, returning what we have 12033 1726867168.72203: results queue empty 12033 1726867168.72205: checking for any_errors_fatal 12033 1726867168.72206: done checking for 
any_errors_fatal 12033 1726867168.72207: checking for max_fail_percentage 12033 1726867168.72208: done checking for max_fail_percentage 12033 1726867168.72209: checking to see if all hosts have failed and the running result is not ok 12033 1726867168.72212: done checking to see if all hosts have failed 12033 1726867168.72213: getting the remaining hosts for this loop 12033 1726867168.72214: done getting the remaining hosts for this loop 12033 1726867168.72218: getting the next task for host managed_node3 12033 1726867168.72227: done getting next task for host managed_node3 12033 1726867168.72229: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12033 1726867168.72233: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867168.72237: getting variables 12033 1726867168.72239: in VariableManager get_vars() 12033 1726867168.72268: Calling all_inventory to load vars for managed_node3 12033 1726867168.72271: Calling groups_inventory to load vars for managed_node3 12033 1726867168.72274: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.72518: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000017b 12033 1726867168.72521: WORKER PROCESS EXITING 12033 1726867168.72532: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.72536: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.72539: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.73094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.73303: done with get_vars() 12033 1726867168.73313: done getting variables 12033 1726867168.73412: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12033 1726867168.73533: variable 'interface' from source: task vars 12033 1726867168.73537: variable 'dhcp_interface1' from source: play vars 12033 1726867168.73603: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:28 -0400 (0:00:00.382) 0:00:07.852 ****** 12033 1726867168.73635: entering _queue_task() for managed_node3/assert 12033 1726867168.73637: Creating lock for assert 12033 1726867168.73900: worker is 1 (out of 1 available) 12033 1726867168.73912: 
exiting _queue_task() for managed_node3/assert 12033 1726867168.74040: done queuing things up, now waiting for results queue to drain 12033 1726867168.74042: waiting for pending results... 12033 1726867168.74189: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 12033 1726867168.74326: in run() - task 0affcac9-a3a5-74bb-502b-00000000011d 12033 1726867168.74346: variable 'ansible_search_path' from source: unknown 12033 1726867168.74353: variable 'ansible_search_path' from source: unknown 12033 1726867168.74402: calling self._execute() 12033 1726867168.74490: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.74503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.74517: variable 'omit' from source: magic vars 12033 1726867168.74894: variable 'ansible_distribution_major_version' from source: facts 12033 1726867168.74921: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867168.74932: variable 'omit' from source: magic vars 12033 1726867168.74997: variable 'omit' from source: magic vars 12033 1726867168.75101: variable 'interface' from source: task vars 12033 1726867168.75110: variable 'dhcp_interface1' from source: play vars 12033 1726867168.75183: variable 'dhcp_interface1' from source: play vars 12033 1726867168.75208: variable 'omit' from source: magic vars 12033 1726867168.75261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867168.75311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867168.75334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867168.75367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.75462: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.75465: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867168.75469: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.75471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.75535: Set connection var ansible_pipelining to False 12033 1726867168.75552: Set connection var ansible_shell_executable to /bin/sh 12033 1726867168.75579: Set connection var ansible_timeout to 10 12033 1726867168.75590: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867168.75597: Set connection var ansible_connection to ssh 12033 1726867168.75607: Set connection var ansible_shell_type to sh 12033 1726867168.75631: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.75639: variable 'ansible_connection' from source: unknown 12033 1726867168.75645: variable 'ansible_module_compression' from source: unknown 12033 1726867168.75651: variable 'ansible_shell_type' from source: unknown 12033 1726867168.75657: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.75662: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.75669: variable 'ansible_pipelining' from source: unknown 12033 1726867168.75787: variable 'ansible_timeout' from source: unknown 12033 1726867168.75792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.75845: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867168.75860: variable 'omit' from source: magic vars 12033 1726867168.75869: 
starting attempt loop 12033 1726867168.75875: running the handler 12033 1726867168.76025: variable 'interface_stat' from source: set_fact 12033 1726867168.76051: Evaluated conditional (interface_stat.stat.exists): True 12033 1726867168.76063: handler run complete 12033 1726867168.76084: attempt loop complete, returning result 12033 1726867168.76113: _execute() done 12033 1726867168.76120: dumping result to json 12033 1726867168.76126: done dumping result, returning 12033 1726867168.76129: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0affcac9-a3a5-74bb-502b-00000000011d] 12033 1726867168.76223: sending task result for task 0affcac9-a3a5-74bb-502b-00000000011d 12033 1726867168.76402: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000011d 12033 1726867168.76405: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867168.76444: no more pending results, returning what we have 12033 1726867168.76447: results queue empty 12033 1726867168.76448: checking for any_errors_fatal 12033 1726867168.76455: done checking for any_errors_fatal 12033 1726867168.76456: checking for max_fail_percentage 12033 1726867168.76457: done checking for max_fail_percentage 12033 1726867168.76458: checking to see if all hosts have failed and the running result is not ok 12033 1726867168.76459: done checking to see if all hosts have failed 12033 1726867168.76460: getting the remaining hosts for this loop 12033 1726867168.76461: done getting the remaining hosts for this loop 12033 1726867168.76464: getting the next task for host managed_node3 12033 1726867168.76471: done getting next task for host managed_node3 12033 1726867168.76474: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12033 1726867168.76486: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867168.76489: getting variables 12033 1726867168.76491: in VariableManager get_vars() 12033 1726867168.76514: Calling all_inventory to load vars for managed_node3 12033 1726867168.76516: Calling groups_inventory to load vars for managed_node3 12033 1726867168.76519: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.76528: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.76530: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.76533: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.76775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.76997: done with get_vars() 12033 1726867168.77007: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:28 -0400 (0:00:00.034) 0:00:07.886 ****** 12033 1726867168.77104: entering _queue_task() for managed_node3/include_tasks 12033 1726867168.77349: worker is 1 (out of 1 available) 12033 1726867168.77585: exiting _queue_task() for managed_node3/include_tasks 12033 1726867168.77595: done queuing things up, now waiting for results queue to drain 12033 1726867168.77596: waiting for pending results... 12033 1726867168.77767: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12033 1726867168.77986: in run() - task 0affcac9-a3a5-74bb-502b-000000000121 12033 1726867168.77992: variable 'ansible_search_path' from source: unknown 12033 1726867168.77995: variable 'ansible_search_path' from source: unknown 12033 1726867168.78210: calling self._execute() 12033 1726867168.78317: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.78330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.78426: variable 'omit' from source: magic vars 12033 1726867168.79002: variable 'ansible_distribution_major_version' from source: facts 12033 1726867168.79011: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867168.79016: _execute() done 12033 1726867168.79019: dumping result to json 12033 1726867168.79022: done dumping result, returning 12033 1726867168.79033: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-74bb-502b-000000000121] 12033 1726867168.79036: sending task result for task 0affcac9-a3a5-74bb-502b-000000000121 12033 1726867168.79119: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000121 12033 1726867168.79122: WORKER PROCESS EXITING 12033 1726867168.79159: no more pending results, returning what we have 12033 
1726867168.79164: in VariableManager get_vars() 12033 1726867168.79195: Calling all_inventory to load vars for managed_node3 12033 1726867168.79198: Calling groups_inventory to load vars for managed_node3 12033 1726867168.79200: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.79210: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.79212: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.79214: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.79372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.79482: done with get_vars() 12033 1726867168.79488: variable 'ansible_search_path' from source: unknown 12033 1726867168.79489: variable 'ansible_search_path' from source: unknown 12033 1726867168.79511: we have included files to process 12033 1726867168.79512: generating all_blocks data 12033 1726867168.79513: done generating all_blocks data 12033 1726867168.79515: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.79516: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.79517: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867168.79634: done processing included file 12033 1726867168.79636: iterating over new_blocks loaded from include file 12033 1726867168.79637: in VariableManager get_vars() 12033 1726867168.79646: done with get_vars() 12033 1726867168.79647: filtering new block on tags 12033 1726867168.79664: done filtering new block on tags 12033 1726867168.79665: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12033 1726867168.79668: extending task lists for all hosts with included blocks 12033 1726867168.79792: done extending task lists 12033 1726867168.79793: done processing included files 12033 1726867168.79794: results queue empty 12033 1726867168.79794: checking for any_errors_fatal 12033 1726867168.79796: done checking for any_errors_fatal 12033 1726867168.79797: checking for max_fail_percentage 12033 1726867168.79797: done checking for max_fail_percentage 12033 1726867168.79798: checking to see if all hosts have failed and the running result is not ok 12033 1726867168.79799: done checking to see if all hosts have failed 12033 1726867168.79799: getting the remaining hosts for this loop 12033 1726867168.79800: done getting the remaining hosts for this loop 12033 1726867168.79801: getting the next task for host managed_node3 12033 1726867168.79804: done getting next task for host managed_node3 12033 1726867168.79805: ^ task is: TASK: Get stat for interface {{ interface }} 12033 1726867168.79809: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867168.79811: getting variables 12033 1726867168.79812: in VariableManager get_vars() 12033 1726867168.79818: Calling all_inventory to load vars for managed_node3 12033 1726867168.79819: Calling groups_inventory to load vars for managed_node3 12033 1726867168.79820: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867168.79824: Calling all_plugins_play to load vars for managed_node3 12033 1726867168.79825: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867168.79826: Calling groups_plugins_play to load vars for managed_node3 12033 1726867168.79905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867168.80015: done with get_vars() 12033 1726867168.80021: done getting variables 12033 1726867168.80121: variable 'interface' from source: task vars 12033 1726867168.80124: variable 'dhcp_interface2' from source: play vars 12033 1726867168.80163: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:28 -0400 (0:00:00.030) 
0:00:07.917 ****** 12033 1726867168.80187: entering _queue_task() for managed_node3/stat 12033 1726867168.80354: worker is 1 (out of 1 available) 12033 1726867168.80367: exiting _queue_task() for managed_node3/stat 12033 1726867168.80381: done queuing things up, now waiting for results queue to drain 12033 1726867168.80383: waiting for pending results... 12033 1726867168.80693: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 12033 1726867168.80757: in run() - task 0affcac9-a3a5-74bb-502b-00000000019f 12033 1726867168.80857: variable 'ansible_search_path' from source: unknown 12033 1726867168.81183: variable 'ansible_search_path' from source: unknown 12033 1726867168.81189: calling self._execute() 12033 1726867168.81383: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.81387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.81389: variable 'omit' from source: magic vars 12033 1726867168.81887: variable 'ansible_distribution_major_version' from source: facts 12033 1726867168.82160: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867168.82163: variable 'omit' from source: magic vars 12033 1726867168.82165: variable 'omit' from source: magic vars 12033 1726867168.82250: variable 'interface' from source: task vars 12033 1726867168.82386: variable 'dhcp_interface2' from source: play vars 12033 1726867168.82451: variable 'dhcp_interface2' from source: play vars 12033 1726867168.82505: variable 'omit' from source: magic vars 12033 1726867168.82624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867168.82663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867168.82719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867168.82882: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.82885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867168.82887: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867168.82890: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.82892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.83058: Set connection var ansible_pipelining to False 12033 1726867168.83113: Set connection var ansible_shell_executable to /bin/sh 12033 1726867168.83127: Set connection var ansible_timeout to 10 12033 1726867168.83146: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867168.83153: Set connection var ansible_connection to ssh 12033 1726867168.83162: Set connection var ansible_shell_type to sh 12033 1726867168.83193: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.83204: variable 'ansible_connection' from source: unknown 12033 1726867168.83217: variable 'ansible_module_compression' from source: unknown 12033 1726867168.83226: variable 'ansible_shell_type' from source: unknown 12033 1726867168.83234: variable 'ansible_shell_executable' from source: unknown 12033 1726867168.83301: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867168.83304: variable 'ansible_pipelining' from source: unknown 12033 1726867168.83307: variable 'ansible_timeout' from source: unknown 12033 1726867168.83309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867168.83464: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867168.83481: variable 'omit' from source: magic vars 12033 1726867168.83491: starting attempt loop 12033 1726867168.83498: running the handler 12033 1726867168.83514: _low_level_execute_command(): starting 12033 1726867168.83524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867168.84201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867168.84280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.84291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.84307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.84333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.84404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.86098: stdout chunk (state=3): >>>/root <<< 12033 1726867168.86202: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.86251: stderr chunk (state=3): >>><<< 12033 1726867168.86276: stdout chunk (state=3): >>><<< 12033 1726867168.86320: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.86435: _low_level_execute_command(): starting 12033 1726867168.86439: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405 `" && echo ansible-tmp-1726867168.863285-12381-94856979639405="` echo /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405 `" ) && sleep 0' 12033 1726867168.87019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.87033: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.87058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.87076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.87183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.87206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867168.87219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.87240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.87345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.89311: stdout chunk (state=3): >>>ansible-tmp-1726867168.863285-12381-94856979639405=/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405 <<< 12033 1726867168.89453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.89464: stdout chunk (state=3): >>><<< 12033 1726867168.89482: stderr chunk (state=3): >>><<< 12033 1726867168.89683: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867168.863285-12381-94856979639405=/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.89686: variable 'ansible_module_compression' from source: unknown 12033 1726867168.89691: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867168.89693: variable 'ansible_facts' from source: unknown 12033 1726867168.89748: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py 12033 1726867168.89899: Sending initial data 12033 1726867168.89929: Sent initial data (151 bytes) 12033 1726867168.90315: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.90328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867168.90340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.90386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.90404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.90445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.91999: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867168.92074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867168.92129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpahhvlgzy /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py <<< 12033 1726867168.92133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py" <<< 12033 1726867168.92170: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpahhvlgzy" to remote "/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py" <<< 12033 1726867168.92941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.92985: stderr chunk (state=3): >>><<< 12033 1726867168.92993: stdout chunk (state=3): >>><<< 12033 1726867168.93048: done transferring module to remote 12033 1726867168.93056: _low_level_execute_command(): starting 12033 1726867168.93059: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/ /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py && sleep 0' 12033 1726867168.93462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.93492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.93496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.93498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.93543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.93546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.93599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867168.95383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867168.95401: stdout chunk (state=3): >>><<< 12033 1726867168.95406: stderr chunk (state=3): >>><<< 12033 1726867168.95497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867168.95501: _low_level_execute_command(): starting 12033 1726867168.95503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/AnsiballZ_stat.py && sleep 0' 12033 1726867168.96002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867168.96014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867168.96020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.96058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867168.96062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.96064: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867168.96066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867168.96159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867168.96162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867168.96193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.11407: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1726867167.0668266, "mtime": 1726867167.0668266, "ctime": 1726867167.0668266, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867169.12700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867169.12734: stderr chunk (state=3): >>><<< 12033 1726867169.12748: stdout chunk (state=3): >>><<< 12033 1726867169.12783: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1726867167.0668266, "mtime": 1726867167.0668266, "ctime": 1726867167.0668266, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867169.12838: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867169.12928: _low_level_execute_command(): starting 12033 1726867169.12936: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867168.863285-12381-94856979639405/ > /dev/null 2>&1 && sleep 0' 12033 1726867169.13518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.13531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867169.13543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867169.13560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867169.13608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.13683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867169.13709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.13785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.15699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867169.15702: stdout chunk (state=3): >>><<< 12033 1726867169.15704: stderr chunk (state=3): >>><<< 12033 1726867169.15719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867169.15731: handler run complete 12033 1726867169.15882: attempt loop complete, returning result 12033 1726867169.15885: _execute() done 12033 1726867169.15887: dumping result to json 12033 1726867169.15892: done dumping result, returning 12033 1726867169.15895: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0affcac9-a3a5-74bb-502b-00000000019f] 12033 1726867169.15897: sending task result for task 0affcac9-a3a5-74bb-502b-00000000019f
ok: [managed_node3] => {
    "changed": false,
    "stat": {
        "atime": 1726867167.0668266,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1726867167.0668266,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 27888,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/test2",
        "lnk_target": "../../devices/virtual/net/test2",
        "mode": "0777",
        "mtime": 1726867167.0668266,
        "nlink": 1,
        "path": "/sys/class/net/test2",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
12033 1726867169.16160: no more pending results, returning what we have 12033 1726867169.16164: results queue empty 12033 1726867169.16165: checking for any_errors_fatal 12033 1726867169.16167: done checking for any_errors_fatal 12033 
1726867169.16168: checking for max_fail_percentage 12033 1726867169.16169: done checking for max_fail_percentage 12033 1726867169.16171: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.16171: done checking to see if all hosts have failed 12033 1726867169.16172: getting the remaining hosts for this loop 12033 1726867169.16174: done getting the remaining hosts for this loop 12033 1726867169.16382: getting the next task for host managed_node3 12033 1726867169.16390: done getting next task for host managed_node3 12033 1726867169.16392: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12033 1726867169.16396: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867169.16400: getting variables 12033 1726867169.16401: in VariableManager get_vars() 12033 1726867169.16423: Calling all_inventory to load vars for managed_node3 12033 1726867169.16425: Calling groups_inventory to load vars for managed_node3 12033 1726867169.16428: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.16438: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.16441: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.16445: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.16664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.16871: done with get_vars() 12033 1726867169.16883: done getting variables 12033 1726867169.16931: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000019f 12033 1726867169.16934: WORKER PROCESS EXITING 12033 1726867169.16960: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867169.17098: variable 'interface' from source: task vars 12033 1726867169.17102: variable 'dhcp_interface2' from source: play vars 12033 1726867169.17161: variable 'dhcp_interface2' from source: play vars
TASK [Assert that the interface is present - 'test2'] **************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Friday 20 September 2024 17:19:29 -0400 (0:00:00.370) 0:00:08.287 ******
12033 1726867169.17199: entering _queue_task() for managed_node3/assert 12033 1726867169.17423: worker is 1 (out of 1 available) 12033 1726867169.17439: exiting _queue_task() for managed_node3/assert 12033 
1726867169.17451: done queuing things up, now waiting for results queue to drain 12033 1726867169.17454: waiting for pending results... 12033 1726867169.17611: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 12033 1726867169.17728: in run() - task 0affcac9-a3a5-74bb-502b-000000000122 12033 1726867169.17738: variable 'ansible_search_path' from source: unknown 12033 1726867169.17779: variable 'ansible_search_path' from source: unknown 12033 1726867169.17796: calling self._execute() 12033 1726867169.17992: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.17996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.17999: variable 'omit' from source: magic vars 12033 1726867169.18273: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.18294: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.18307: variable 'omit' from source: magic vars 12033 1726867169.18365: variable 'omit' from source: magic vars 12033 1726867169.18461: variable 'interface' from source: task vars 12033 1726867169.18471: variable 'dhcp_interface2' from source: play vars 12033 1726867169.18559: variable 'dhcp_interface2' from source: play vars 12033 1726867169.18586: variable 'omit' from source: magic vars 12033 1726867169.18640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867169.18697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867169.18733: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867169.18753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867169.18762: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867169.18787: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867169.18793: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.18795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.18861: Set connection var ansible_pipelining to False 12033 1726867169.18868: Set connection var ansible_shell_executable to /bin/sh 12033 1726867169.18881: Set connection var ansible_timeout to 10 12033 1726867169.18886: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867169.18891: Set connection var ansible_connection to ssh 12033 1726867169.18894: Set connection var ansible_shell_type to sh 12033 1726867169.18910: variable 'ansible_shell_executable' from source: unknown 12033 1726867169.18913: variable 'ansible_connection' from source: unknown 12033 1726867169.18915: variable 'ansible_module_compression' from source: unknown 12033 1726867169.18920: variable 'ansible_shell_type' from source: unknown 12033 1726867169.18922: variable 'ansible_shell_executable' from source: unknown 12033 1726867169.18924: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.18929: variable 'ansible_pipelining' from source: unknown 12033 1726867169.18931: variable 'ansible_timeout' from source: unknown 12033 1726867169.18935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.19053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867169.19075: variable 'omit' from source: magic vars 12033 1726867169.19080: starting 
attempt loop 12033 1726867169.19083: running the handler 12033 1726867169.19303: variable 'interface_stat' from source: set_fact 12033 1726867169.19306: Evaluated conditional (interface_stat.stat.exists): True 12033 1726867169.19309: handler run complete 12033 1726867169.19311: attempt loop complete, returning result 12033 1726867169.19313: _execute() done 12033 1726867169.19315: dumping result to json 12033 1726867169.19317: done dumping result, returning 12033 1726867169.19319: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0affcac9-a3a5-74bb-502b-000000000122] 12033 1726867169.19320: sending task result for task 0affcac9-a3a5-74bb-502b-000000000122 12033 1726867169.19375: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000122 12033 1726867169.19380: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
12033 1726867169.19431: no more pending results, returning what we have 12033 1726867169.19435: results queue empty 12033 1726867169.19436: checking for any_errors_fatal 12033 1726867169.19444: done checking for any_errors_fatal 12033 1726867169.19445: checking for max_fail_percentage 12033 1726867169.19448: done checking for max_fail_percentage 12033 1726867169.19449: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.19450: done checking to see if all hosts have failed 12033 1726867169.19450: getting the remaining hosts for this loop 12033 1726867169.19453: done getting the remaining hosts for this loop 12033 1726867169.19456: getting the next task for host managed_node3 12033 1726867169.19466: done getting next task for host managed_node3 12033 1726867169.19470: ^ task is: TASK: Test 12033 1726867169.19474: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867169.19480: getting variables 12033 1726867169.19482: in VariableManager get_vars() 12033 1726867169.19516: Calling all_inventory to load vars for managed_node3 12033 1726867169.19519: Calling groups_inventory to load vars for managed_node3 12033 1726867169.19523: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.19535: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.19538: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.19541: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.19896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.20136: done with get_vars() 12033 1726867169.20148: done getting variables
TASK [Test] ********************************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30
Friday 20 September 2024 17:19:29 -0400 (0:00:00.030) 0:00:08.318 ******
12033 1726867169.20229: entering _queue_task() for managed_node3/include_tasks 12033 1726867169.20404: worker is 1 (out of 1 available) 12033 1726867169.20416: exiting _queue_task() for managed_node3/include_tasks 12033 1726867169.20427: done queuing things up, now waiting for results queue to drain 12033 1726867169.20428: waiting for pending results... 
12033 1726867169.20884: running TaskExecutor() for managed_node3/TASK: Test 12033 1726867169.20888: in run() - task 0affcac9-a3a5-74bb-502b-00000000008c 12033 1726867169.20891: variable 'ansible_search_path' from source: unknown 12033 1726867169.20893: variable 'ansible_search_path' from source: unknown 12033 1726867169.20895: variable 'lsr_test' from source: include params 12033 1726867169.21037: variable 'lsr_test' from source: include params 12033 1726867169.21106: variable 'omit' from source: magic vars 12033 1726867169.21211: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.21224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.21237: variable 'omit' from source: magic vars 12033 1726867169.21453: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.21468: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.21479: variable 'item' from source: unknown 12033 1726867169.21542: variable 'item' from source: unknown 12033 1726867169.21576: variable 'item' from source: unknown 12033 1726867169.21640: variable 'item' from source: unknown 12033 1726867169.21984: dumping result to json 12033 1726867169.21988: done dumping result, returning 12033 1726867169.21991: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-74bb-502b-00000000008c] 12033 1726867169.21994: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008c 12033 1726867169.22036: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008c 12033 1726867169.22040: WORKER PROCESS EXITING 12033 1726867169.22060: no more pending results, returning what we have 12033 1726867169.22064: in VariableManager get_vars() 12033 1726867169.22095: Calling all_inventory to load vars for managed_node3 12033 1726867169.22098: Calling groups_inventory to load vars for managed_node3 12033 1726867169.22100: Calling all_plugins_inventory to load 
vars for managed_node3 12033 1726867169.22114: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.22117: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.22120: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.22307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.22506: done with get_vars() 12033 1726867169.22514: variable 'ansible_search_path' from source: unknown 12033 1726867169.22515: variable 'ansible_search_path' from source: unknown 12033 1726867169.22555: we have included files to process 12033 1726867169.22556: generating all_blocks data 12033 1726867169.22558: done generating all_blocks data 12033 1726867169.22563: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12033 1726867169.22565: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12033 1726867169.22567: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 12033 1726867169.23036: done processing included file 12033 1726867169.23038: iterating over new_blocks loaded from include file 12033 1726867169.23040: in VariableManager get_vars() 12033 1726867169.23052: done with get_vars() 12033 1726867169.23054: filtering new block on tags 12033 1726867169.23098: done filtering new block on tags 12033 1726867169.23101: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed_node3 => (item=tasks/create_bond_profile.yml) 12033 1726867169.23105: extending task lists for all hosts with included blocks 12033 1726867169.25216: done extending task 
lists 12033 1726867169.25217: done processing included files 12033 1726867169.25218: results queue empty 12033 1726867169.25218: checking for any_errors_fatal 12033 1726867169.25221: done checking for any_errors_fatal 12033 1726867169.25222: checking for max_fail_percentage 12033 1726867169.25223: done checking for max_fail_percentage 12033 1726867169.25224: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.25225: done checking to see if all hosts have failed 12033 1726867169.25225: getting the remaining hosts for this loop 12033 1726867169.25226: done getting the remaining hosts for this loop 12033 1726867169.25229: getting the next task for host managed_node3 12033 1726867169.25233: done getting next task for host managed_node3 12033 1726867169.25235: ^ task is: TASK: Include network role 12033 1726867169.25238: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867169.25240: getting variables 12033 1726867169.25246: in VariableManager get_vars() 12033 1726867169.25254: Calling all_inventory to load vars for managed_node3 12033 1726867169.25256: Calling groups_inventory to load vars for managed_node3 12033 1726867169.25259: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.25264: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.25266: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.25268: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.25398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.25585: done with get_vars() 12033 1726867169.25598: done getting variables
TASK [Include network role] ****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3
Friday 20 September 2024 17:19:29 -0400 (0:00:00.054) 0:00:08.372 ******
12033 1726867169.25671: entering _queue_task() for managed_node3/include_role 12033 1726867169.25673: Creating lock for include_role 12033 1726867169.25970: worker is 1 (out of 1 available) 12033 1726867169.25985: exiting _queue_task() for managed_node3/include_role 12033 1726867169.25999: done queuing things up, now waiting for results queue to drain 12033 1726867169.26001: waiting for pending results... 
12033 1726867169.26291: running TaskExecutor() for managed_node3/TASK: Include network role 12033 1726867169.26446: in run() - task 0affcac9-a3a5-74bb-502b-0000000001c5 12033 1726867169.26470: variable 'ansible_search_path' from source: unknown 12033 1726867169.26480: variable 'ansible_search_path' from source: unknown 12033 1726867169.26525: calling self._execute() 12033 1726867169.26639: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.26653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.26675: variable 'omit' from source: magic vars 12033 1726867169.27041: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.27056: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.27066: _execute() done 12033 1726867169.27073: dumping result to json 12033 1726867169.27103: done dumping result, returning 12033 1726867169.27106: done running TaskExecutor() for managed_node3/TASK: Include network role [0affcac9-a3a5-74bb-502b-0000000001c5] 12033 1726867169.27108: sending task result for task 0affcac9-a3a5-74bb-502b-0000000001c5 12033 1726867169.27405: no more pending results, returning what we have 12033 1726867169.27410: in VariableManager get_vars() 12033 1726867169.27449: Calling all_inventory to load vars for managed_node3 12033 1726867169.27452: Calling groups_inventory to load vars for managed_node3 12033 1726867169.27455: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.27469: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.27472: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.27476: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.27791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.28023: done with get_vars() 12033 1726867169.28032: 
variable 'ansible_search_path' from source: unknown 12033 1726867169.28033: variable 'ansible_search_path' from source: unknown 12033 1726867169.28217: variable 'omit' from source: magic vars 12033 1726867169.28262: variable 'omit' from source: magic vars 12033 1726867169.28278: variable 'omit' from source: magic vars 12033 1726867169.28282: we have included files to process 12033 1726867169.28283: generating all_blocks data 12033 1726867169.28284: done generating all_blocks data 12033 1726867169.28285: processing included file: fedora.linux_system_roles.network 12033 1726867169.28308: in VariableManager get_vars() 12033 1726867169.28318: done with get_vars() 12033 1726867169.28353: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000001c5 12033 1726867169.28356: WORKER PROCESS EXITING 12033 1726867169.28404: in VariableManager get_vars() 12033 1726867169.28422: done with get_vars() 12033 1726867169.28479: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12033 1726867169.28816: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12033 1726867169.29065: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12033 1726867169.30390: in VariableManager get_vars() 12033 1726867169.30413: done with get_vars() 12033 1726867169.30845: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867169.33073: iterating over new_blocks loaded from include file 12033 1726867169.33075: in VariableManager get_vars() 12033 1726867169.33092: done with get_vars() 12033 1726867169.33094: filtering new block on tags 12033 1726867169.33714: done filtering new block on tags 12033 1726867169.33718: in VariableManager get_vars() 12033 1726867169.33733: done with get_vars() 12033 1726867169.33734: 
filtering new block on tags 12033 1726867169.33751: done filtering new block on tags 12033 1726867169.33753: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 12033 1726867169.33757: extending task lists for all hosts with included blocks 12033 1726867169.34002: done extending task lists 12033 1726867169.34004: done processing included files 12033 1726867169.34007: results queue empty 12033 1726867169.34008: checking for any_errors_fatal 12033 1726867169.34012: done checking for any_errors_fatal 12033 1726867169.34015: checking for max_fail_percentage 12033 1726867169.34016: done checking for max_fail_percentage 12033 1726867169.34017: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.34018: done checking to see if all hosts have failed 12033 1726867169.34018: getting the remaining hosts for this loop 12033 1726867169.34020: done getting the remaining hosts for this loop 12033 1726867169.34022: getting the next task for host managed_node3 12033 1726867169.34027: done getting next task for host managed_node3 12033 1726867169.34029: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867169.34059: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867169.34069: getting variables 12033 1726867169.34071: in VariableManager get_vars() 12033 1726867169.34086: Calling all_inventory to load vars for managed_node3 12033 1726867169.34088: Calling groups_inventory to load vars for managed_node3 12033 1726867169.34090: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.34095: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.34097: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.34100: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.34423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.34628: done with get_vars() 12033 1726867169.34637: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:19:29 -0400 (0:00:00.090) 0:00:08.462 ****** 12033 1726867169.34709: entering _queue_task() for managed_node3/include_tasks 12033 1726867169.34970: worker is 1 (out of 1 available) 12033 1726867169.34986: exiting _queue_task() for managed_node3/include_tasks 12033 1726867169.34998: done queuing things up, now waiting for results queue to drain 12033 1726867169.34999: waiting for pending results... 
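The task queued here, `fedora.linux_system_roles.network : Ensure ansible_facts used by role` at `roles/network/tasks/main.yml:4`, is dispatched as `include_tasks`; the log subsequently shows it loading `roles/network/tasks/set_facts.yml`. A minimal sketch of what main.yml:4 likely looks like, assuming no extra arguments:

```yaml
# Plausible shape of tasks/main.yml:4 as implied by the log.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```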
12033 1726867169.35230: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867169.35352: in run() - task 0affcac9-a3a5-74bb-502b-000000000277 12033 1726867169.35360: variable 'ansible_search_path' from source: unknown 12033 1726867169.35363: variable 'ansible_search_path' from source: unknown 12033 1726867169.35400: calling self._execute() 12033 1726867169.35469: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.35473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.35485: variable 'omit' from source: magic vars 12033 1726867169.35818: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.35828: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.35834: _execute() done 12033 1726867169.35837: dumping result to json 12033 1726867169.35840: done dumping result, returning 12033 1726867169.35847: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-74bb-502b-000000000277] 12033 1726867169.35850: sending task result for task 0affcac9-a3a5-74bb-502b-000000000277 12033 1726867169.36030: no more pending results, returning what we have 12033 1726867169.36035: in VariableManager get_vars() 12033 1726867169.36065: Calling all_inventory to load vars for managed_node3 12033 1726867169.36068: Calling groups_inventory to load vars for managed_node3 12033 1726867169.36070: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.36079: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.36082: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.36085: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.36241: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000277 12033 
1726867169.36245: WORKER PROCESS EXITING 12033 1726867169.36266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.36514: done with get_vars() 12033 1726867169.36522: variable 'ansible_search_path' from source: unknown 12033 1726867169.36523: variable 'ansible_search_path' from source: unknown 12033 1726867169.36562: we have included files to process 12033 1726867169.36563: generating all_blocks data 12033 1726867169.36565: done generating all_blocks data 12033 1726867169.36568: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867169.36569: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867169.36571: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867169.37239: done processing included file 12033 1726867169.37241: iterating over new_blocks loaded from include file 12033 1726867169.37242: in VariableManager get_vars() 12033 1726867169.37264: done with get_vars() 12033 1726867169.37265: filtering new block on tags 12033 1726867169.37300: done filtering new block on tags 12033 1726867169.37303: in VariableManager get_vars() 12033 1726867169.37324: done with get_vars() 12033 1726867169.37326: filtering new block on tags 12033 1726867169.37368: done filtering new block on tags 12033 1726867169.37370: in VariableManager get_vars() 12033 1726867169.37397: done with get_vars() 12033 1726867169.37399: filtering new block on tags 12033 1726867169.37442: done filtering new block on tags 12033 1726867169.37445: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12033 1726867169.37450: extending task lists for all hosts 
with included blocks 12033 1726867169.39170: done extending task lists 12033 1726867169.39171: done processing included files 12033 1726867169.39172: results queue empty 12033 1726867169.39173: checking for any_errors_fatal 12033 1726867169.39175: done checking for any_errors_fatal 12033 1726867169.39175: checking for max_fail_percentage 12033 1726867169.39176: done checking for max_fail_percentage 12033 1726867169.39179: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.39180: done checking to see if all hosts have failed 12033 1726867169.39180: getting the remaining hosts for this loop 12033 1726867169.39182: done getting the remaining hosts for this loop 12033 1726867169.39184: getting the next task for host managed_node3 12033 1726867169.39188: done getting next task for host managed_node3 12033 1726867169.39191: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867169.39194: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867169.39203: getting variables 12033 1726867169.39204: in VariableManager get_vars() 12033 1726867169.39215: Calling all_inventory to load vars for managed_node3 12033 1726867169.39217: Calling groups_inventory to load vars for managed_node3 12033 1726867169.39224: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.39228: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.39231: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.39234: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.39375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.39587: done with get_vars() 12033 1726867169.39596: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:19:29 -0400 (0:00:00.049) 0:00:08.512 ****** 12033 1726867169.39667: entering _queue_task() for managed_node3/setup 12033 1726867169.39918: worker is 1 (out of 1 available) 12033 1726867169.39930: exiting _queue_task() for managed_node3/setup 12033 1726867169.39941: done queuing things up, now waiting for results queue to drain 12033 1726867169.39943: waiting for pending results... 
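This task, `Ensure ansible_facts used by role are present` at `set_facts.yml:3`, is queued as the `setup` module, and the log below shows both its guard condition (`__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, which evaluates False here, so the task is skipped) and that the result is censored because `no_log: true` is set. A hedged sketch under those observations; the `gather_subset` argument is an assumption, not shown in the log:

```yaml
# Sketch of set_facts.yml:3; gather_subset is assumed.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min  # assumption — the log does not show module args
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```

The `when:` expression gathers facts only if some fact named in `__network_required_facts` (a role default) is missing from the already-collected `ansible_facts`, which is why the task is skipped on this run.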
12033 1726867169.40201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867169.40347: in run() - task 0affcac9-a3a5-74bb-502b-0000000002d4 12033 1726867169.40382: variable 'ansible_search_path' from source: unknown 12033 1726867169.40385: variable 'ansible_search_path' from source: unknown 12033 1726867169.40416: calling self._execute() 12033 1726867169.40529: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.40536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.40538: variable 'omit' from source: magic vars 12033 1726867169.40892: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.40909: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.41165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867169.43346: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867169.43358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867169.43401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867169.43444: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867169.43482: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867169.43571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867169.43610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867169.43647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867169.43697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867169.43716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867169.43779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867169.43883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867169.43886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867169.43888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867169.43890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867169.44051: variable '__network_required_facts' from source: role 
'' defaults 12033 1726867169.44063: variable 'ansible_facts' from source: unknown 12033 1726867169.44154: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12033 1726867169.44164: when evaluation is False, skipping this task 12033 1726867169.44172: _execute() done 12033 1726867169.44180: dumping result to json 12033 1726867169.44188: done dumping result, returning 12033 1726867169.44228: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-74bb-502b-0000000002d4] 12033 1726867169.44231: sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d4 12033 1726867169.44312: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d4 12033 1726867169.44315: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867169.44368: no more pending results, returning what we have 12033 1726867169.44371: results queue empty 12033 1726867169.44372: checking for any_errors_fatal 12033 1726867169.44374: done checking for any_errors_fatal 12033 1726867169.44374: checking for max_fail_percentage 12033 1726867169.44376: done checking for max_fail_percentage 12033 1726867169.44379: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.44380: done checking to see if all hosts have failed 12033 1726867169.44380: getting the remaining hosts for this loop 12033 1726867169.44382: done getting the remaining hosts for this loop 12033 1726867169.44385: getting the next task for host managed_node3 12033 1726867169.44394: done getting next task for host managed_node3 12033 1726867169.44397: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867169.44404: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867169.44417: getting variables 12033 1726867169.44419: in VariableManager get_vars() 12033 1726867169.44455: Calling all_inventory to load vars for managed_node3 12033 1726867169.44458: Calling groups_inventory to load vars for managed_node3 12033 1726867169.44460: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.44469: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.44472: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.44488: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.44645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.44767: done with get_vars() 12033 1726867169.44775: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:19:29 -0400 (0:00:00.051) 0:00:08.564 ****** 12033 1726867169.44842: entering _queue_task() for managed_node3/stat 12033 1726867169.45023: worker is 1 (out of 1 available) 12033 1726867169.45036: exiting _queue_task() for managed_node3/stat 12033 1726867169.45049: done queuing things up, now waiting for results queue to drain 12033 1726867169.45051: waiting for pending results... 
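The `Check if system is ostree` task at `set_facts.yml:12` is queued as the `stat` module and skipped because its condition `not __network_is_ostree is defined` is False (the flag was already set by an earlier `set_fact`). A sketch, with the checked path and register name flagged as assumptions:

```yaml
# Sketch of set_facts.yml:12; the path and register name are assumptions.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # conventional ostree marker; not shown in the log
  register: __ostree_booted_stat  # hypothetical name
  when: not __network_is_ostree is defined
```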
12033 1726867169.45205: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867169.45293: in run() - task 0affcac9-a3a5-74bb-502b-0000000002d6 12033 1726867169.45307: variable 'ansible_search_path' from source: unknown 12033 1726867169.45311: variable 'ansible_search_path' from source: unknown 12033 1726867169.45336: calling self._execute() 12033 1726867169.45401: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.45405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.45414: variable 'omit' from source: magic vars 12033 1726867169.45661: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.45670: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.45781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867169.45961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867169.46010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867169.46038: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867169.46093: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867169.46400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867169.46402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867169.46405: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867169.46407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867169.46418: variable '__network_is_ostree' from source: set_fact 12033 1726867169.46430: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867169.46438: when evaluation is False, skipping this task 12033 1726867169.46444: _execute() done 12033 1726867169.46450: dumping result to json 12033 1726867169.46456: done dumping result, returning 12033 1726867169.46466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-74bb-502b-0000000002d6] 12033 1726867169.46473: sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d6 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867169.46675: no more pending results, returning what we have 12033 1726867169.46681: results queue empty 12033 1726867169.46683: checking for any_errors_fatal 12033 1726867169.46692: done checking for any_errors_fatal 12033 1726867169.46693: checking for max_fail_percentage 12033 1726867169.46695: done checking for max_fail_percentage 12033 1726867169.46696: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.46696: done checking to see if all hosts have failed 12033 1726867169.46697: getting the remaining hosts for this loop 12033 1726867169.46699: done getting the remaining hosts for this loop 12033 1726867169.46702: getting the next task for host managed_node3 12033 1726867169.46710: done getting next task for host managed_node3 12033 
1726867169.46713: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867169.46719: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867169.46732: getting variables 12033 1726867169.46764: in VariableManager get_vars() 12033 1726867169.46962: Calling all_inventory to load vars for managed_node3 12033 1726867169.46965: Calling groups_inventory to load vars for managed_node3 12033 1726867169.46968: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.46974: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d6 12033 1726867169.46979: WORKER PROCESS EXITING 12033 1726867169.47003: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.47007: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.47011: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.47189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.47316: done with get_vars() 12033 1726867169.47323: done getting variables 12033 1726867169.47358: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:19:29 -0400 (0:00:00.025) 0:00:08.589 ****** 12033 1726867169.47382: entering _queue_task() for managed_node3/set_fact 12033 1726867169.47555: worker is 1 (out of 1 available) 12033 1726867169.47567: exiting _queue_task() for managed_node3/set_fact 12033 1726867169.47581: done queuing things up, now waiting for results queue to drain 12033 1726867169.47583: waiting for pending results... 
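The companion task at `set_facts.yml:17`, `Set flag to indicate system is ostree`, is a `set_fact` guarded by the same `not __network_is_ostree is defined` condition, and is likewise skipped on this run. A hedged sketch; the registered stat variable it would read from is hypothetical:

```yaml
# Sketch of set_facts.yml:17; __ostree_booted_stat is a hypothetical
# register name for the preceding stat task's result.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```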
12033 1726867169.47728: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867169.47813: in run() - task 0affcac9-a3a5-74bb-502b-0000000002d7 12033 1726867169.47823: variable 'ansible_search_path' from source: unknown 12033 1726867169.47827: variable 'ansible_search_path' from source: unknown 12033 1726867169.47855: calling self._execute() 12033 1726867169.47914: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.47918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.47925: variable 'omit' from source: magic vars 12033 1726867169.48214: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.48222: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.48329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867169.48502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867169.48532: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867169.48555: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867169.48580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867169.48638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867169.48655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867169.48673: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867169.48697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867169.48755: variable '__network_is_ostree' from source: set_fact 12033 1726867169.48760: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867169.48763: when evaluation is False, skipping this task 12033 1726867169.48766: _execute() done 12033 1726867169.48769: dumping result to json 12033 1726867169.48772: done dumping result, returning 12033 1726867169.48780: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-74bb-502b-0000000002d7] 12033 1726867169.48783: sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d7 12033 1726867169.48862: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d7 12033 1726867169.48865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867169.48931: no more pending results, returning what we have 12033 1726867169.48934: results queue empty 12033 1726867169.48934: checking for any_errors_fatal 12033 1726867169.48939: done checking for any_errors_fatal 12033 1726867169.48939: checking for max_fail_percentage 12033 1726867169.48941: done checking for max_fail_percentage 12033 1726867169.48941: checking to see if all hosts have failed and the running result is not ok 12033 1726867169.48942: done checking to see if all hosts have failed 12033 1726867169.48943: getting the remaining hosts for this loop 12033 1726867169.48944: done getting the remaining hosts for this loop 
12033 1726867169.48947: getting the next task for host managed_node3 12033 1726867169.48957: done getting next task for host managed_node3 12033 1726867169.48960: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867169.48965: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867169.48984: getting variables 12033 1726867169.48988: in VariableManager get_vars() 12033 1726867169.49014: Calling all_inventory to load vars for managed_node3 12033 1726867169.49016: Calling groups_inventory to load vars for managed_node3 12033 1726867169.49017: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867169.49023: Calling all_plugins_play to load vars for managed_node3 12033 1726867169.49024: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867169.49026: Calling groups_plugins_play to load vars for managed_node3 12033 1726867169.49222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867169.49442: done with get_vars() 12033 1726867169.49451: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:19:29 -0400 (0:00:00.021) 0:00:08.611 ****** 12033 1726867169.49547: entering _queue_task() for managed_node3/service_facts 12033 1726867169.49548: Creating lock for service_facts 12033 1726867169.49797: worker is 1 (out of 1 available) 12033 1726867169.49811: exiting _queue_task() for managed_node3/service_facts 12033 1726867169.49966: done queuing things up, now waiting for results queue to drain 12033 1726867169.49968: waiting for pending results... 
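Before the raw connection trace that follows, it may help to see the shape of the two low-level shell probes Ansible issues next in this log: the home-directory discovery (`/bin/sh -c 'echo ~ && sleep 0'`) and the locked-down remote tmpdir creation (`umask 77 && mkdir -p ...`). The sketch below is not Ansible's actual plugin code; it runs the same shell patterns locally via `subprocess`, with a throwaway directory standing in for the remote `$HOME/.ansible/tmp`:

```python
import subprocess
import tempfile
import time

# Probe 1 from the log: discover the login shell's home directory.
# Ansible runs this over SSH; here it runs locally.
home = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Probe 2 from the log: create a uniquely named temp directory under
# umask 77, so the directory comes out mode 0700 (owner-only). The
# timestamp-based name mirrors ansible-tmp-<epoch>-<pid>-<random>.
base = tempfile.mkdtemp()  # stand-in for the remote ~/.ansible/tmp
tmp_name = f"ansible-tmp-{time.time()}-demo"
cmd = (
    f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{tmp_name}" '
    f'&& echo "{base}/{tmp_name}" ) && sleep 0'
)
tmpdir = subprocess.run(
    ["/bin/sh", "-c", cmd],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(home, tmpdir)
```

The `&& echo ...` at the end is how Ansible learns the tmpdir path it just created: the remote command's stdout (visible later in this log as the `ansible-tmp-...=` line) is parsed back on the controller.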
12033 1726867169.50204: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867169.50338: in run() - task 0affcac9-a3a5-74bb-502b-0000000002d9 12033 1726867169.50342: variable 'ansible_search_path' from source: unknown 12033 1726867169.50346: variable 'ansible_search_path' from source: unknown 12033 1726867169.50391: calling self._execute() 12033 1726867169.50426: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.50430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.50440: variable 'omit' from source: magic vars 12033 1726867169.50688: variable 'ansible_distribution_major_version' from source: facts 12033 1726867169.50703: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867169.50706: variable 'omit' from source: magic vars 12033 1726867169.50753: variable 'omit' from source: magic vars 12033 1726867169.50775: variable 'omit' from source: magic vars 12033 1726867169.50808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867169.50835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867169.50849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867169.50862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867169.50871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867169.50897: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867169.50900: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.50902: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867169.50969: Set connection var ansible_pipelining to False 12033 1726867169.50976: Set connection var ansible_shell_executable to /bin/sh 12033 1726867169.50985: Set connection var ansible_timeout to 10 12033 1726867169.50989: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867169.50995: Set connection var ansible_connection to ssh 12033 1726867169.51000: Set connection var ansible_shell_type to sh 12033 1726867169.51015: variable 'ansible_shell_executable' from source: unknown 12033 1726867169.51018: variable 'ansible_connection' from source: unknown 12033 1726867169.51021: variable 'ansible_module_compression' from source: unknown 12033 1726867169.51023: variable 'ansible_shell_type' from source: unknown 12033 1726867169.51025: variable 'ansible_shell_executable' from source: unknown 12033 1726867169.51029: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867169.51031: variable 'ansible_pipelining' from source: unknown 12033 1726867169.51033: variable 'ansible_timeout' from source: unknown 12033 1726867169.51038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867169.51184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867169.51189: variable 'omit' from source: magic vars 12033 1726867169.51191: starting attempt loop 12033 1726867169.51194: running the handler 12033 1726867169.51256: _low_level_execute_command(): starting 12033 1726867169.51261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867169.51884: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.51888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12033 1726867169.51890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867169.51893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867169.51922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867169.51939: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867169.51957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.52037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.52113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867169.52176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.52397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.53910: stdout chunk (state=3): >>>/root <<< 12033 1726867169.54010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867169.54040: stderr chunk (state=3): >>><<< 12033 1726867169.54051: stdout chunk (state=3): >>><<< 12033 1726867169.54067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867169.54083: _low_level_execute_command(): starting 12033 1726867169.54093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282 `" && echo ansible-tmp-1726867169.5407102-12422-241442038217282="` echo /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282 `" ) && sleep 0' 12033 1726867169.54560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.54595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.54685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867169.54692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867169.54787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.54926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.56802: stdout chunk (state=3): >>>ansible-tmp-1726867169.5407102-12422-241442038217282=/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282 <<< 12033 1726867169.57082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867169.57086: stdout chunk (state=3): >>><<< 12033 1726867169.57091: stderr chunk (state=3): >>><<< 12033 1726867169.57094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867169.5407102-12422-241442038217282=/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867169.57096: variable 'ansible_module_compression' from source: unknown 12033 1726867169.57098: ANSIBALLZ: Using lock for service_facts 12033 1726867169.57100: ANSIBALLZ: Acquiring lock 12033 1726867169.57101: ANSIBALLZ: Lock acquired: 139897894885312 12033 1726867169.57103: ANSIBALLZ: Creating module 12033 1726867169.71106: ANSIBALLZ: Writing module into payload 12033 1726867169.71225: ANSIBALLZ: Writing module 12033 1726867169.71256: ANSIBALLZ: Renaming module 12033 1726867169.71274: ANSIBALLZ: Done creating module 12033 1726867169.71308: variable 'ansible_facts' from source: unknown 12033 1726867169.71400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py 12033 1726867169.71540: Sending initial data 12033 1726867169.71610: Sent initial data (162 bytes) 12033 1726867169.72293: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.72309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867169.72374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.72438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867169.72455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867169.72492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.72694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.74255: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867169.74300: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 12033 1726867169.74353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmprzbz2fzt /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py <<< 12033 1726867169.74357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py" <<< 12033 1726867169.74417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmprzbz2fzt" to remote "/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py" <<< 12033 1726867169.75538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867169.75594: stderr chunk (state=3): >>><<< 12033 1726867169.75598: stdout chunk (state=3): >>><<< 12033 1726867169.75640: done transferring module to remote 12033 1726867169.75649: _low_level_execute_command(): starting 12033 1726867169.75654: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/ /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py && sleep 0' 12033 1726867169.76200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.76209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867169.76220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867169.76234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867169.76246: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867169.76253: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867169.76295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867169.76352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867169.76362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867169.76396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.76515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867169.78418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867169.78422: stdout chunk (state=3): >>><<< 12033 1726867169.78424: stderr chunk (state=3): >>><<< 12033 1726867169.78625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867169.78628: _low_level_execute_command(): starting 12033 1726867169.78631: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/AnsiballZ_service_facts.py && sleep 0' 12033 1726867169.79550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867169.79594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867169.79606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867169.79622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867169.79636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867169.79771: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867169.79922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867169.79968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867171.31991: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 12033 1726867171.32353: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12033 1726867171.33985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867171.33992: stdout chunk (state=3): >>><<< 12033 1726867171.33995: stderr chunk (state=3): >>><<< 12033 1726867171.33999: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": 
{"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": 
"dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867171.35812: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867171.36083: _low_level_execute_command(): starting 12033 1726867171.36087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867169.5407102-12422-241442038217282/ > /dev/null 2>&1 && sleep 0' 12033 1726867171.37225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867171.37229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867171.37231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867171.37234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found <<< 12033 1726867171.37236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867171.37283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867171.37596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867171.39510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867171.39513: stdout chunk (state=3): >>><<< 12033 1726867171.39515: stderr chunk (state=3): >>><<< 12033 1726867171.39683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867171.39686: handler run 
complete 12033 1726867171.40152: variable 'ansible_facts' from source: unknown 12033 1726867171.40650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867171.42047: variable 'ansible_facts' from source: unknown 12033 1726867171.45609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867171.46603: attempt loop complete, returning result 12033 1726867171.46607: _execute() done 12033 1726867171.46609: dumping result to json 12033 1726867171.46611: done dumping result, returning 12033 1726867171.46613: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-74bb-502b-0000000002d9] 12033 1726867171.46615: sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d9 12033 1726867171.48923: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000002d9 12033 1726867171.48927: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867171.49028: no more pending results, returning what we have 12033 1726867171.49031: results queue empty 12033 1726867171.49032: checking for any_errors_fatal 12033 1726867171.49037: done checking for any_errors_fatal 12033 1726867171.49038: checking for max_fail_percentage 12033 1726867171.49040: done checking for max_fail_percentage 12033 1726867171.49040: checking to see if all hosts have failed and the running result is not ok 12033 1726867171.49041: done checking to see if all hosts have failed 12033 1726867171.49042: getting the remaining hosts for this loop 12033 1726867171.49043: done getting the remaining hosts for this loop 12033 1726867171.49047: getting the next task for host managed_node3 12033 1726867171.49052: done getting next task for host managed_node3 12033 
1726867171.49056: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867171.49061: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867171.49070: getting variables 12033 1726867171.49071: in VariableManager get_vars() 12033 1726867171.49104: Calling all_inventory to load vars for managed_node3 12033 1726867171.49107: Calling groups_inventory to load vars for managed_node3 12033 1726867171.49109: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867171.49117: Calling all_plugins_play to load vars for managed_node3 12033 1726867171.49120: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867171.49123: Calling groups_plugins_play to load vars for managed_node3 12033 1726867171.49987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867171.51103: done with get_vars() 12033 1726867171.51117: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:19:31 -0400 (0:00:02.016) 0:00:10.627 ****** 12033 1726867171.51210: entering _queue_task() for managed_node3/package_facts 12033 1726867171.51212: Creating lock for package_facts 12033 1726867171.51704: worker is 1 (out of 1 available) 12033 1726867171.51716: exiting _queue_task() for managed_node3/package_facts 12033 1726867171.51729: done queuing things up, now waiting for results queue to drain 12033 1726867171.51731: waiting for pending results... 
12033 1726867171.52497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867171.52558: in run() - task 0affcac9-a3a5-74bb-502b-0000000002da 12033 1726867171.52638: variable 'ansible_search_path' from source: unknown 12033 1726867171.52647: variable 'ansible_search_path' from source: unknown 12033 1726867171.52734: calling self._execute() 12033 1726867171.52884: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867171.52891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867171.52895: variable 'omit' from source: magic vars 12033 1726867171.53695: variable 'ansible_distribution_major_version' from source: facts 12033 1726867171.53717: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867171.53728: variable 'omit' from source: magic vars 12033 1726867171.53947: variable 'omit' from source: magic vars 12033 1726867171.54019: variable 'omit' from source: magic vars 12033 1726867171.54124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867171.54221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867171.54364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867171.54474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867171.54480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867171.54482: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867171.54485: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867171.54487: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867171.54662: Set connection var ansible_pipelining to False 12033 1726867171.54706: Set connection var ansible_shell_executable to /bin/sh 12033 1726867171.54911: Set connection var ansible_timeout to 10 12033 1726867171.54915: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867171.54917: Set connection var ansible_connection to ssh 12033 1726867171.54919: Set connection var ansible_shell_type to sh 12033 1726867171.54921: variable 'ansible_shell_executable' from source: unknown 12033 1726867171.54923: variable 'ansible_connection' from source: unknown 12033 1726867171.54926: variable 'ansible_module_compression' from source: unknown 12033 1726867171.54928: variable 'ansible_shell_type' from source: unknown 12033 1726867171.54930: variable 'ansible_shell_executable' from source: unknown 12033 1726867171.54932: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867171.54934: variable 'ansible_pipelining' from source: unknown 12033 1726867171.54936: variable 'ansible_timeout' from source: unknown 12033 1726867171.54938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867171.55283: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867171.55315: variable 'omit' from source: magic vars 12033 1726867171.55393: starting attempt loop 12033 1726867171.55402: running the handler 12033 1726867171.55551: _low_level_execute_command(): starting 12033 1726867171.55554: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867171.56935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867171.56969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12033 1726867171.56972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867171.56975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867171.57186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867171.57194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867171.57197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867171.57200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867171.57408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867171.59089: stdout chunk (state=3): >>>/root <<< 12033 1726867171.59238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867171.59248: stdout chunk (state=3): >>><<< 12033 1726867171.59260: stderr chunk (state=3): >>><<< 12033 1726867171.59283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867171.59308: _low_level_execute_command(): starting 12033 1726867171.59348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733 `" && echo ansible-tmp-1726867171.5928965-12525-146572037403733="` echo /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733 `" ) && sleep 0' 12033 1726867171.60493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867171.60508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867171.60525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867171.60544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867171.60646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867171.60912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867171.60954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867171.62853: stdout chunk (state=3): >>>ansible-tmp-1726867171.5928965-12525-146572037403733=/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733 <<< 12033 1726867171.63060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867171.63063: stdout chunk (state=3): >>><<< 12033 1726867171.63065: stderr chunk (state=3): >>><<< 12033 1726867171.63081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867171.5928965-12525-146572037403733=/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867171.63129: variable 'ansible_module_compression' from source: unknown 12033 1726867171.63482: ANSIBALLZ: Using lock for package_facts 12033 1726867171.63486: ANSIBALLZ: Acquiring lock 12033 1726867171.63488: ANSIBALLZ: Lock acquired: 139897895165136 12033 1726867171.63490: ANSIBALLZ: Creating module 12033 1726867172.25162: ANSIBALLZ: Writing module into payload 12033 1726867172.25487: ANSIBALLZ: Writing module 12033 1726867172.25575: ANSIBALLZ: Renaming module 12033 1726867172.25593: ANSIBALLZ: Done creating module 12033 1726867172.25699: variable 'ansible_facts' from source: unknown 12033 1726867172.26135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py 12033 1726867172.26662: Sending initial data 12033 1726867172.26665: Sent initial data (162 bytes) 12033 1726867172.27970: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867172.28031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867172.28358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867172.28465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867172.30126: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867172.30161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867172.30231: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpt3xyku2z /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py <<< 12033 1726867172.30235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py" <<< 12033 1726867172.30557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpt3xyku2z" to remote "/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py" <<< 12033 1726867172.33171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867172.33367: stderr chunk (state=3): >>><<< 12033 1726867172.33370: stdout chunk (state=3): >>><<< 12033 1726867172.33372: done transferring module to remote 12033 1726867172.33374: _low_level_execute_command(): starting 12033 1726867172.33378: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/ /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py && sleep 0' 12033 1726867172.34699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867172.34807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867172.34824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867172.34844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867172.34920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867172.36781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867172.36856: stderr chunk (state=3): >>><<< 12033 1726867172.36860: stdout chunk (state=3): >>><<< 12033 1726867172.36875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867172.36887: _low_level_execute_command(): starting 12033 1726867172.36900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/AnsiballZ_package_facts.py && sleep 0' 12033 1726867172.37920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867172.37961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867172.37983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867172.38013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867172.38217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 
1726867172.82302: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": 
[{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", 
"release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap":<<< 12033 1726867172.82573: stdout chunk (state=3): >>> [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": 
[{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": 
"3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", 
"release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", 
"version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", 
"source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": 
[{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": 
[{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": 
"4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12033 1726867172.84386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867172.84393: stdout chunk (state=3): >>><<< 12033 1726867172.84396: stderr chunk (state=3): >>><<< 12033 1726867172.84405: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": 
"google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": 
[{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": 
[{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": 
"5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": 
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": 
[{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": 
"rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867172.88368: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867172.88454: _low_level_execute_command(): starting 12033 1726867172.88457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867171.5928965-12525-146572037403733/ > /dev/null 2>&1 && sleep 0' 12033 1726867172.89292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867172.89314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867172.89412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867172.89482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867172.89532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867172.89564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867172.89806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867172.91573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867172.91579: stdout chunk (state=3): >>><<< 12033 1726867172.91583: stderr chunk (state=3): >>><<< 12033 1726867172.91888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867172.91894: handler run complete 12033 1726867172.94472: variable 'ansible_facts' from source: unknown 12033 1726867172.95410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867172.99599: variable 'ansible_facts' from source: unknown 12033 1726867173.00542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.02102: attempt loop complete, returning result 12033 1726867173.02153: _execute() done 12033 1726867173.02161: dumping result to json 12033 1726867173.02548: done dumping result, returning 12033 1726867173.02784: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-74bb-502b-0000000002da] 12033 1726867173.02791: sending task result for task 0affcac9-a3a5-74bb-502b-0000000002da 12033 1726867173.14168: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000002da 12033 1726867173.14171: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867173.14268: no more pending results, returning what we have 12033 1726867173.14271: results queue empty 12033 1726867173.14272: checking for any_errors_fatal 12033 1726867173.14275: done checking for any_errors_fatal 12033 1726867173.14276: checking for max_fail_percentage 12033 1726867173.14279: done checking for max_fail_percentage 12033 1726867173.14280: checking to see if all hosts have failed and the running result is not ok 12033 1726867173.14280: done checking to see if all hosts have failed 12033 1726867173.14281: getting the 
remaining hosts for this loop 12033 1726867173.14282: done getting the remaining hosts for this loop 12033 1726867173.14286: getting the next task for host managed_node3 12033 1726867173.14294: done getting next task for host managed_node3 12033 1726867173.14298: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867173.14303: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867173.14312: getting variables 12033 1726867173.14313: in VariableManager get_vars() 12033 1726867173.14336: Calling all_inventory to load vars for managed_node3 12033 1726867173.14339: Calling groups_inventory to load vars for managed_node3 12033 1726867173.14341: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867173.14349: Calling all_plugins_play to load vars for managed_node3 12033 1726867173.14351: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867173.14354: Calling groups_plugins_play to load vars for managed_node3 12033 1726867173.15863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.17504: done with get_vars() 12033 1726867173.17523: done getting variables 12033 1726867173.17586: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:19:33 -0400 (0:00:01.664) 0:00:12.292 ****** 12033 1726867173.17624: entering _queue_task() for managed_node3/debug 12033 1726867173.17905: worker is 1 (out of 1 available) 12033 1726867173.17918: exiting _queue_task() for managed_node3/debug 12033 1726867173.17929: done queuing things up, now waiting for results queue to drain 12033 1726867173.17931: waiting for pending results... 
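The long JSON blob earlier in this log is the `package_facts` module's result: a dict mapping each package name to a list of `{name, version, release, epoch, arch, source}` entries (censored from the task result here because `no_log: true` was set). As a minimal sketch of consuming that shape — not part of Ansible itself, just plain Python over the same structure, with values copied from the log above:

```python
# Sketch: working with package_facts-style data, using the dict shape
# visible in the log (name -> list of {name, version, release, epoch, arch, source}).
# The two entries below are copied from the openssl and git records above.

packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def nevra(pkg):
    """Render an rpm-style name-[epoch:]version-release.arch string.
    A null (or zero) epoch is simply omitted here, by convention."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"

print(nevra(packages["openssl"][0]))  # openssl-1:3.2.2-12.el10.x86_64
print(nevra(packages["git"][0]))      # git-2.45.2-3.el10.x86_64
```

In a playbook these facts land under `ansible_facts.packages`; the role in this log only checks which packages are present, which is why the (large) result is marked `no_log` and shows up as "censored" in the task output.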
12033 1726867173.18305: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867173.18341: in run() - task 0affcac9-a3a5-74bb-502b-000000000278 12033 1726867173.18360: variable 'ansible_search_path' from source: unknown 12033 1726867173.18366: variable 'ansible_search_path' from source: unknown 12033 1726867173.18412: calling self._execute() 12033 1726867173.18495: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867173.18526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.18543: variable 'omit' from source: magic vars 12033 1726867173.18902: variable 'ansible_distribution_major_version' from source: facts 12033 1726867173.18919: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867173.18932: variable 'omit' from source: magic vars 12033 1726867173.19000: variable 'omit' from source: magic vars 12033 1726867173.19102: variable 'network_provider' from source: set_fact 12033 1726867173.19126: variable 'omit' from source: magic vars 12033 1726867173.19175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867173.19217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867173.19276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867173.19281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867173.19297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867173.19332: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867173.19386: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867173.19389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.19454: Set connection var ansible_pipelining to False 12033 1726867173.19467: Set connection var ansible_shell_executable to /bin/sh 12033 1726867173.19480: Set connection var ansible_timeout to 10 12033 1726867173.19497: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867173.19504: Set connection var ansible_connection to ssh 12033 1726867173.19513: Set connection var ansible_shell_type to sh 12033 1726867173.19536: variable 'ansible_shell_executable' from source: unknown 12033 1726867173.19582: variable 'ansible_connection' from source: unknown 12033 1726867173.19585: variable 'ansible_module_compression' from source: unknown 12033 1726867173.19587: variable 'ansible_shell_type' from source: unknown 12033 1726867173.19589: variable 'ansible_shell_executable' from source: unknown 12033 1726867173.19591: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867173.19599: variable 'ansible_pipelining' from source: unknown 12033 1726867173.19604: variable 'ansible_timeout' from source: unknown 12033 1726867173.19606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.19986: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867173.19989: variable 'omit' from source: magic vars 12033 1726867173.19992: starting attempt loop 12033 1726867173.19994: running the handler 12033 1726867173.20006: handler run complete 12033 1726867173.20025: attempt loop complete, returning result 12033 1726867173.20032: _execute() done 12033 1726867173.20041: dumping result to json 12033 1726867173.20048: done dumping result, returning 
12033 1726867173.20060: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-74bb-502b-000000000278] 12033 1726867173.20069: sending task result for task 0affcac9-a3a5-74bb-502b-000000000278 12033 1726867173.20355: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000278 12033 1726867173.20359: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 12033 1726867173.20422: no more pending results, returning what we have 12033 1726867173.20426: results queue empty 12033 1726867173.20427: checking for any_errors_fatal 12033 1726867173.20440: done checking for any_errors_fatal 12033 1726867173.20441: checking for max_fail_percentage 12033 1726867173.20443: done checking for max_fail_percentage 12033 1726867173.20444: checking to see if all hosts have failed and the running result is not ok 12033 1726867173.20445: done checking to see if all hosts have failed 12033 1726867173.20445: getting the remaining hosts for this loop 12033 1726867173.20447: done getting the remaining hosts for this loop 12033 1726867173.20451: getting the next task for host managed_node3 12033 1726867173.20459: done getting next task for host managed_node3 12033 1726867173.20462: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867173.20469: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
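Records like `Evaluated conditional (ansible_distribution_major_version != '6'): True` above, and the `skipping: ... "skip_reason": "Conditional result was False"` results that follow, show the `when`-clause gate each task passes through. A purely illustrative sketch of that decision — not Ansible's internal API, just the observable behavior mirrored in plain Python:

```python
# Illustrative sketch only -- NOT Ansible's internals. It mimics the gate
# visible in this log: a task's `when` expression is evaluated first, and a
# False result produces a skip record instead of running the task.

def gate_task(condition, expression, run):
    """Run `run()` if `condition` holds; otherwise report a skip in the
    shape this log prints ("Conditional result was False")."""
    if not condition:
        return {"changed": False,
                "false_condition": expression,
                "skip_reason": "Conditional result was False"}
    return run()

network_state = {}  # the role default seen in the log
result = gate_task(network_state != {}, "network_state != {}",
                   lambda: {"changed": True})
print(result["skip_reason"])  # Conditional result was False
```

This matches the skipped "Abort applying the network state configuration ..." tasks below: `network_state` is still the role's empty-dict default, so `network_state != {}` evaluates False and both abort checks are skipped.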
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867173.20488: getting variables 12033 1726867173.20490: in VariableManager get_vars() 12033 1726867173.20525: Calling all_inventory to load vars for managed_node3 12033 1726867173.20528: Calling groups_inventory to load vars for managed_node3 12033 1726867173.20531: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867173.20542: Calling all_plugins_play to load vars for managed_node3 12033 1726867173.20545: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867173.20548: Calling groups_plugins_play to load vars for managed_node3 12033 1726867173.22765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.24405: done with get_vars() 12033 1726867173.24425: done getting variables 12033 1726867173.24522: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 17:19:33 -0400 (0:00:00.069) 0:00:12.361 ******
12033 1726867173.24566: entering _queue_task() for managed_node3/fail
12033 1726867173.24568: Creating lock for fail
12033 1726867173.25091: worker is 1 (out of 1 available)
12033 1726867173.25104: exiting _queue_task() for managed_node3/fail
12033 1726867173.25115: done queuing things up, now waiting for results queue to drain
12033 1726867173.25117: waiting for pending results...
12033 1726867173.25397: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12033 1726867173.25412: in run() - task 0affcac9-a3a5-74bb-502b-000000000279
12033 1726867173.25502: variable 'ansible_search_path' from source: unknown
12033 1726867173.25506: variable 'ansible_search_path' from source: unknown
12033 1726867173.25511: calling self._execute()
12033 1726867173.25564: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867173.25576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867173.25611: variable 'omit' from source: magic vars
12033 1726867173.25974: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.25995: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867173.26155: variable 'network_state' from source: role '' defaults
12033 1726867173.26161: Evaluated conditional (network_state != {}): False
12033 1726867173.26167: when evaluation is False, skipping this task
12033 1726867173.26169: _execute() done
12033 1726867173.26172: dumping result to json
12033 1726867173.26174: done dumping result, returning
12033 1726867173.26179: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-74bb-502b-000000000279]
12033 1726867173.26190: sending task result for task 0affcac9-a3a5-74bb-502b-000000000279
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12033 1726867173.26424: no more pending results, returning what we have
12033 1726867173.26431: results queue empty
12033 1726867173.26432: checking for any_errors_fatal
12033 1726867173.26440: done checking for any_errors_fatal
12033 1726867173.26441: checking for max_fail_percentage
12033 1726867173.26444: done checking for max_fail_percentage
12033 1726867173.26447: checking to see if all hosts have failed and the running result is not ok
12033 1726867173.26448: done checking to see if all hosts have failed
12033 1726867173.26448: getting the remaining hosts for this loop
12033 1726867173.26450: done getting the remaining hosts for this loop
12033 1726867173.26454: getting the next task for host managed_node3
12033 1726867173.26463: done getting next task for host managed_node3
12033 1726867173.26466: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12033 1726867173.26472: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867173.26490: getting variables
12033 1726867173.26492: in VariableManager get_vars()
12033 1726867173.26525: Calling all_inventory to load vars for managed_node3
12033 1726867173.26528: Calling groups_inventory to load vars for managed_node3
12033 1726867173.26530: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867173.26542: Calling all_plugins_play to load vars for managed_node3
12033 1726867173.26545: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867173.26548: Calling groups_plugins_play to load vars for managed_node3
12033 1726867173.27190: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000279
12033 1726867173.27194: WORKER PROCESS EXITING
12033 1726867173.27966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867173.30100: done with get_vars()
12033 1726867173.30125: done getting variables
12033 1726867173.30196: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 17:19:33 -0400 (0:00:00.056) 0:00:12.418 ******
12033 1726867173.30251: entering _queue_task() for managed_node3/fail
12033 1726867173.30935: worker is 1 (out of 1 available)
12033 1726867173.30987: exiting _queue_task() for managed_node3/fail
12033 1726867173.31003: done queuing things up, now waiting for results queue to drain
12033 1726867173.31005: waiting for pending results...
12033 1726867173.31666: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12033 1726867173.31857: in run() - task 0affcac9-a3a5-74bb-502b-00000000027a
12033 1726867173.31884: variable 'ansible_search_path' from source: unknown
12033 1726867173.31896: variable 'ansible_search_path' from source: unknown
12033 1726867173.31940: calling self._execute()
12033 1726867173.32034: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867173.32085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867173.32091: variable 'omit' from source: magic vars
12033 1726867173.33083: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.33283: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867173.33287: variable 'network_state' from source: role '' defaults
12033 1726867173.33290: Evaluated conditional (network_state != {}): False
12033 1726867173.33292: when evaluation is False, skipping this task
12033 1726867173.33294: _execute() done
12033 1726867173.33297: dumping result to json
12033 1726867173.33299: done dumping result, returning
12033 1726867173.33302: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-74bb-502b-00000000027a]
12033 1726867173.33304: sending task result for task 0affcac9-a3a5-74bb-502b-00000000027a
12033 1726867173.33375: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027a
12033 1726867173.33380: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12033 1726867173.33429: no more pending results, returning what we have
12033 1726867173.33433: results queue empty
12033 1726867173.33434: checking for any_errors_fatal
12033 1726867173.33443: done checking for any_errors_fatal
12033 1726867173.33443: checking for max_fail_percentage
12033 1726867173.33445: done checking for max_fail_percentage
12033 1726867173.33446: checking to see if all hosts have failed and the running result is not ok
12033 1726867173.33447: done checking to see if all hosts have failed
12033 1726867173.33448: getting the remaining hosts for this loop
12033 1726867173.33450: done getting the remaining hosts for this loop
12033 1726867173.33453: getting the next task for host managed_node3
12033 1726867173.33461: done getting next task for host managed_node3
12033 1726867173.33465: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12033 1726867173.33471: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867173.33489: getting variables
12033 1726867173.33491: in VariableManager get_vars()
12033 1726867173.33525: Calling all_inventory to load vars for managed_node3
12033 1726867173.33528: Calling groups_inventory to load vars for managed_node3
12033 1726867173.33530: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867173.33541: Calling all_plugins_play to load vars for managed_node3
12033 1726867173.33544: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867173.33547: Calling groups_plugins_play to load vars for managed_node3
12033 1726867173.36160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867173.38073: done with get_vars()
12033 1726867173.38099: done getting variables
12033 1726867173.38154: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 
September 2024 17:19:33 -0400 (0:00:00.079) 0:00:12.497 ******
12033 1726867173.38195: entering _queue_task() for managed_node3/fail
12033 1726867173.38572: worker is 1 (out of 1 available)
12033 1726867173.38587: exiting _queue_task() for managed_node3/fail
12033 1726867173.38600: done queuing things up, now waiting for results queue to drain
12033 1726867173.38602: waiting for pending results...
12033 1726867173.38884: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12033 1726867173.39191: in run() - task 0affcac9-a3a5-74bb-502b-00000000027b
12033 1726867173.39196: variable 'ansible_search_path' from source: unknown
12033 1726867173.39198: variable 'ansible_search_path' from source: unknown
12033 1726867173.39201: calling self._execute()
12033 1726867173.39203: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867173.39207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867173.39209: variable 'omit' from source: magic vars
12033 1726867173.39536: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.39547: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867173.39724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867173.42085: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867173.42088: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867173.42131: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867173.42164: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867173.42192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867173.42272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.42306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.42332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.42382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.42399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.42492: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.42510: Evaluated conditional (ansible_distribution_major_version | int > 9): True
12033 1726867173.42883: variable 'ansible_distribution' from source: facts
12033 1726867173.42887: variable '__network_rh_distros' from source: role '' defaults
12033 1726867173.42892: Evaluated conditional (ansible_distribution in __network_rh_distros): True
12033 1726867173.42924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.42953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.42983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.43032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.43048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.43093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.43117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.43147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.43187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.43210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.43263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.43297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.43349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.43374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.43397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.43786: variable 'network_connections' from source: include params
12033 1726867173.43792: variable 'controller_profile' from source: play vars
12033 1726867173.43810: variable 'controller_profile' from source: play vars
12033 1726867173.43826: variable 'controller_device' from source: play vars
12033 1726867173.43897: variable 'controller_device' from source: play vars
12033 1726867173.43915: variable 'port1_profile' from source: play vars
12033 1726867173.43975: variable 'port1_profile' from source: play vars
12033 1726867173.43993: variable 'dhcp_interface1' from source: play vars
12033 1726867173.44062: variable 'dhcp_interface1' from source: play vars
12033 1726867173.44073: variable 'controller_profile' from source: play vars
12033 1726867173.44144: variable 'controller_profile' from source: play vars
12033 1726867173.44223: variable 'port2_profile' from source: play vars
12033 1726867173.44227: variable 'port2_profile' from source: play vars
12033 1726867173.44239: variable 'dhcp_interface2' from source: play vars
12033 1726867173.44304: variable 'dhcp_interface2' from source: play vars
12033 1726867173.44316: variable 'controller_profile' from source: play vars
12033 1726867173.44397: variable 'controller_profile' from source: play vars
12033 1726867173.44410: variable 'network_state' from source: role '' defaults
12033 1726867173.44488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867173.44672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867173.44719: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867173.44768: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867173.44799: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867173.44879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12033 1726867173.44900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12033 1726867173.44986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.44992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12033 1726867173.45009: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
12033 1726867173.45018: when evaluation is False, skipping this task
12033 1726867173.45026: _execute() done
12033 1726867173.45033: dumping result to json
12033 1726867173.45040: done dumping result, returning
12033 1726867173.45053: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-74bb-502b-00000000027b]
12033 1726867173.45063: sending task result for task 0affcac9-a3a5-74bb-502b-00000000027b
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
12033 1726867173.45248: no more pending results, returning what we have
12033 1726867173.45252: results queue empty
12033 1726867173.45253: checking for any_errors_fatal
12033 1726867173.45260: done checking for any_errors_fatal
12033 1726867173.45261: checking for max_fail_percentage
12033 1726867173.45263: done checking for max_fail_percentage
12033 1726867173.45264: checking to see if all hosts have failed and the running result is not ok
12033 1726867173.45264: done checking to see if all hosts have failed
12033 1726867173.45265: getting the remaining hosts for this loop
12033 1726867173.45267: done getting the remaining hosts for this loop
12033 1726867173.45272: getting the next task for host managed_node3
12033 1726867173.45282: done getting next task for host managed_node3
12033 1726867173.45286: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12033 1726867173.45293: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12033 1726867173.45307: getting variables
12033 1726867173.45309: in VariableManager get_vars()
12033 1726867173.45345: Calling all_inventory to load vars for managed_node3
12033 1726867173.45348: Calling groups_inventory to load vars for managed_node3
12033 1726867173.45351: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867173.45361: Calling all_plugins_play to load vars for managed_node3
12033 1726867173.45363: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867173.45366: Calling groups_plugins_play to load vars for managed_node3
12033 1726867173.45995: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027b
12033 1726867173.45999: WORKER PROCESS EXITING
12033 1726867173.46991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867173.49763: done with get_vars()
12033 1726867173.49793: done getting variables
12033 1726867173.49886: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 17:19:33 -0400 (0:00:00.117) 0:00:12.614 ******
12033 1726867173.49920: entering _queue_task() for managed_node3/dnf
12033 1726867173.50318: worker is 1 (out of 1 available)
12033 1726867173.50329: exiting _queue_task() for managed_node3/dnf
12033 1726867173.50339: done queuing things up, now waiting for results queue to drain
12033 1726867173.50341: waiting for pending results...
12033 1726867173.50511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12033 1726867173.50652: in run() - task 0affcac9-a3a5-74bb-502b-00000000027c
12033 1726867173.50678: variable 'ansible_search_path' from source: unknown
12033 1726867173.50688: variable 'ansible_search_path' from source: unknown
12033 1726867173.50728: calling self._execute()
12033 1726867173.50900: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867173.50928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867173.50942: variable 'omit' from source: magic vars
12033 1726867173.51683: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.51688: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867173.52026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867173.57301: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867173.57391: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867173.57484: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867173.57487: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867173.57511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867173.57602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.57638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.57674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.57730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.57750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.57920: variable 'ansible_distribution' from source: facts
12033 1726867173.57923: variable 'ansible_distribution_major_version' from source: facts
12033 1726867173.57925: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12033 1726867173.58040: variable '__network_wireless_connections_defined' from source: role '' defaults
12033 1726867173.58187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.58224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.58256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.58308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.58355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.58379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.58415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.58445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.58525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.58528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.58557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867173.58593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867173.58621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.58681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867173.58693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867173.59114: variable 'network_connections' from source: include params
12033 1726867173.59117: variable 'controller_profile' from source: play vars
12033 1726867173.59287: variable 'controller_profile' from source: play vars
12033 1726867173.59292: variable 'controller_device' from source: play vars
12033 1726867173.59295: variable 'controller_device' from source: play vars
12033 1726867173.59297: variable 'port1_profile' from source: play vars
12033 1726867173.59617: variable 'port1_profile' from source: play vars
12033 1726867173.59724: variable 'dhcp_interface1' from source: play vars
12033 1726867173.59727: variable 'dhcp_interface1' from source: play vars
12033 1726867173.59730: variable 'controller_profile' from source: play vars
12033 1726867173.59771: variable 'controller_profile' from source: play vars
12033 1726867173.59787: variable 'port2_profile' from source: play vars
12033 1726867173.59897: variable 'port2_profile' from source: play vars
12033 1726867173.60051: variable 'dhcp_interface2' from source: play vars
12033 1726867173.60171: variable 'dhcp_interface2' from source: play vars
12033 1726867173.60174: variable 'controller_profile' from source: play vars
12033 1726867173.60507: variable 'controller_profile' from source: play vars
12033 1726867173.60631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867173.61194: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867173.61237: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867173.61498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867173.61520: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867173.61572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12033 1726867173.61759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12033 1726867173.61933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867173.61966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12033 1726867173.62237: variable '__network_team_connections_defined' from source: role '' defaults
12033 1726867173.63082: variable 'network_connections' from source: include params
12033 1726867173.63085: variable 'controller_profile' from source: play vars
12033 1726867173.63088: variable 'controller_profile' from source: play vars
12033 1726867173.63093: variable 'controller_device' from source: play vars
12033 1726867173.63095: variable 'controller_device' from source: play vars
12033 1726867173.63097: variable 'port1_profile' from source: play vars
12033 1726867173.63291: variable 'port1_profile' from source: play vars
12033 1726867173.63305: variable 'dhcp_interface1' from source: play vars
12033 1726867173.63365: variable 'dhcp_interface1' from source: play vars
12033 1726867173.63380: variable 'controller_profile' from source: play vars
12033 1726867173.63682: variable 'controller_profile' from source: play vars
12033 1726867173.63685: variable 'port2_profile' from source: play vars
12033 1726867173.63718: variable 'port2_profile' from source: play vars
12033 1726867173.63730: variable 'dhcp_interface2' from source: play vars
12033 1726867173.63795: variable 'dhcp_interface2' from source: play vars
12033 1726867173.64082: variable 'controller_profile' from source: play vars
12033 1726867173.64085: variable 'controller_profile' from source: play vars
12033 1726867173.64382: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12033 1726867173.64385: when evaluation is False, skipping this task
12033 1726867173.64388: _execute() done
12033 1726867173.64392: dumping result to json
12033 1726867173.64394: done dumping result, returning
12033 1726867173.64396: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-00000000027c]
12033 1726867173.64398: sending task result for task 0affcac9-a3a5-74bb-502b-00000000027c
12033 1726867173.64472: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027c
12033 1726867173.64476: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12033 1726867173.64535: no more pending results, returning what we have
12033 1726867173.64539: results queue empty
12033 1726867173.64540: checking for any_errors_fatal
12033 1726867173.64548: done checking for any_errors_fatal
12033 1726867173.64549: checking for max_fail_percentage
12033 1726867173.64552: done checking for max_fail_percentage
12033 1726867173.64553: checking to see if all hosts have failed and the running result is not ok
12033 1726867173.64554: done checking to see if all hosts have failed
12033 1726867173.64554: getting the remaining hosts for this loop
12033 1726867173.64556: done getting the remaining hosts for this loop
12033 1726867173.64560: getting the next task for host managed_node3
12033 1726867173.64567: done getting next task for host managed_node3
12033 1726867173.64571: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12033 1726867173.64581: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12033 1726867173.64596: getting variables 12033 1726867173.64598: in VariableManager get_vars() 12033 1726867173.64636: Calling all_inventory to load vars for managed_node3 12033 1726867173.64639: Calling groups_inventory to load vars for managed_node3 12033 1726867173.64642: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867173.64653: Calling all_plugins_play to load vars for managed_node3 12033 1726867173.64656: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867173.64659: Calling groups_plugins_play to load vars for managed_node3 12033 1726867173.67110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.69584: done with get_vars() 12033 1726867173.69619: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867173.69832: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:19:33 -0400 (0:00:00.199) 0:00:12.814 ****** 12033 1726867173.69865: entering _queue_task() for managed_node3/yum 12033 1726867173.69867: Creating lock for yum 12033 1726867173.70618: worker is 1 (out of 1 available) 12033 1726867173.70693: exiting _queue_task() for managed_node3/yum 12033 1726867173.70707: done queuing things up, now waiting for results 
queue to drain 12033 1726867173.70709: waiting for pending results... 12033 1726867173.71053: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12033 1726867173.71419: in run() - task 0affcac9-a3a5-74bb-502b-00000000027d 12033 1726867173.71443: variable 'ansible_search_path' from source: unknown 12033 1726867173.71782: variable 'ansible_search_path' from source: unknown 12033 1726867173.71786: calling self._execute() 12033 1726867173.71884: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867173.71887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.71892: variable 'omit' from source: magic vars 12033 1726867173.72416: variable 'ansible_distribution_major_version' from source: facts 12033 1726867173.72436: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867173.72770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867173.76321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867173.76395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867173.76427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867173.76460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867173.76600: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867173.76675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867173.76815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867173.76843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867173.76991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867173.77006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867173.77287: variable 'ansible_distribution_major_version' from source: facts 12033 1726867173.77380: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12033 1726867173.77383: when evaluation is False, skipping this task 12033 1726867173.77386: _execute() done 12033 1726867173.77391: dumping result to json 12033 1726867173.77394: done dumping result, returning 12033 1726867173.77400: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-00000000027d] 12033 1726867173.77403: sending task result for task 0affcac9-a3a5-74bb-502b-00000000027d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12033 1726867173.77733: no more pending results, returning what we have 12033 1726867173.77737: results queue empty 12033 
1726867173.77738: checking for any_errors_fatal 12033 1726867173.77743: done checking for any_errors_fatal 12033 1726867173.77743: checking for max_fail_percentage 12033 1726867173.77745: done checking for max_fail_percentage 12033 1726867173.77746: checking to see if all hosts have failed and the running result is not ok 12033 1726867173.77747: done checking to see if all hosts have failed 12033 1726867173.77748: getting the remaining hosts for this loop 12033 1726867173.77749: done getting the remaining hosts for this loop 12033 1726867173.77752: getting the next task for host managed_node3 12033 1726867173.77760: done getting next task for host managed_node3 12033 1726867173.77766: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867173.77772: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867173.77794: getting variables 12033 1726867173.77796: in VariableManager get_vars() 12033 1726867173.77834: Calling all_inventory to load vars for managed_node3 12033 1726867173.77837: Calling groups_inventory to load vars for managed_node3 12033 1726867173.77839: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867173.77850: Calling all_plugins_play to load vars for managed_node3 12033 1726867173.77853: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867173.77858: Calling groups_plugins_play to load vars for managed_node3 12033 1726867173.78381: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027d 12033 1726867173.78385: WORKER PROCESS EXITING 12033 1726867173.79613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.82413: done with get_vars() 12033 1726867173.82436: done getting variables 12033 1726867173.82508: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:19:33 -0400 (0:00:00.126) 0:00:12.941 ****** 12033 1726867173.82542: entering _queue_task() for managed_node3/fail 12033 1726867173.82861: worker is 1 (out of 1 available) 12033 1726867173.82873: exiting _queue_task() for managed_node3/fail 12033 1726867173.83003: done queuing things up, now waiting for results queue to drain 12033 1726867173.83006: waiting for pending results... 
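The skip recorded just above ("Evaluated conditional (ansible_distribution_major_version | int < 8): False") is the standard `when`-guard pattern: the task body is never executed and Ansible emits `skipping: ... Conditional result was False`. A minimal sketch of that pattern, not the role's actual task file (which the log locates at roles/network/tasks/main.yml:48); the `list: updates` usage is an assumption for illustration:

```yaml
# Illustrative only: a version-gated package-update check.
# On EL8+ hosts the conditional is False, so the task is skipped
# and the log shows "Conditional result was False".
- name: Check if updates for network packages are available (YUM)
  ansible.builtin.yum:        # redirected to ansible.builtin.dnf on modern hosts,
    list: updates             # as the "redirecting (type: action)" log line shows
  when: ansible_distribution_major_version | int < 8
```

Note that the redirect happens at plugin-load time ("redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf"), before the conditional is ever evaluated, which is why the dnf action plugin is loaded even for a task that ends up skipped.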
12033 1726867173.83181: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867173.83279: in run() - task 0affcac9-a3a5-74bb-502b-00000000027e 12033 1726867173.83291: variable 'ansible_search_path' from source: unknown 12033 1726867173.83297: variable 'ansible_search_path' from source: unknown 12033 1726867173.83330: calling self._execute() 12033 1726867173.83583: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867173.83587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.83589: variable 'omit' from source: magic vars 12033 1726867173.83837: variable 'ansible_distribution_major_version' from source: facts 12033 1726867173.83854: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867173.84046: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867173.84325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867173.87189: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867173.87300: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867173.87364: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867173.87416: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867173.87481: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867173.87611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867173.87632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867173.87883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867173.87887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867173.87890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867173.87892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867173.87894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867173.87896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867173.88176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867173.88283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867173.88287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867173.88311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867173.88365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867173.88426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867173.88464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867173.88684: variable 'network_connections' from source: include params 12033 1726867173.88702: variable 'controller_profile' from source: play vars 12033 1726867173.88796: variable 'controller_profile' from source: play vars 12033 1726867173.88814: variable 'controller_device' from source: play vars 12033 1726867173.88882: variable 'controller_device' from source: play vars 12033 1726867173.88910: variable 'port1_profile' from source: play vars 12033 1726867173.89002: variable 'port1_profile' from source: play vars 12033 1726867173.89007: variable 'dhcp_interface1' from source: play vars 12033 1726867173.89075: variable 'dhcp_interface1' from source: play vars 12033 1726867173.89111: variable 'controller_profile' from source: play vars 
12033 1726867173.89327: variable 'controller_profile' from source: play vars 12033 1726867173.89331: variable 'port2_profile' from source: play vars 12033 1726867173.89385: variable 'port2_profile' from source: play vars 12033 1726867173.89447: variable 'dhcp_interface2' from source: play vars 12033 1726867173.89523: variable 'dhcp_interface2' from source: play vars 12033 1726867173.89600: variable 'controller_profile' from source: play vars 12033 1726867173.89970: variable 'controller_profile' from source: play vars 12033 1726867173.89974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867173.90065: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867173.90120: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867173.90159: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867173.90204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867173.90252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867173.90283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867173.90326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867173.90407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12033 1726867173.90448: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867173.90740: variable 'network_connections' from source: include params 12033 1726867173.90752: variable 'controller_profile' from source: play vars 12033 1726867173.90819: variable 'controller_profile' from source: play vars 12033 1726867173.90951: variable 'controller_device' from source: play vars 12033 1726867173.90956: variable 'controller_device' from source: play vars 12033 1726867173.90959: variable 'port1_profile' from source: play vars 12033 1726867173.91000: variable 'port1_profile' from source: play vars 12033 1726867173.91014: variable 'dhcp_interface1' from source: play vars 12033 1726867173.91088: variable 'dhcp_interface1' from source: play vars 12033 1726867173.91100: variable 'controller_profile' from source: play vars 12033 1726867173.91170: variable 'controller_profile' from source: play vars 12033 1726867173.91173: variable 'port2_profile' from source: play vars 12033 1726867173.91237: variable 'port2_profile' from source: play vars 12033 1726867173.91243: variable 'dhcp_interface2' from source: play vars 12033 1726867173.91291: variable 'dhcp_interface2' from source: play vars 12033 1726867173.91300: variable 'controller_profile' from source: play vars 12033 1726867173.91341: variable 'controller_profile' from source: play vars 12033 1726867173.91365: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867173.91368: when evaluation is False, skipping this task 12033 1726867173.91370: _execute() done 12033 1726867173.91373: dumping result to json 12033 1726867173.91375: done dumping result, returning 12033 1726867173.91388: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-00000000027e] 12033 1726867173.91398: 
sending task result for task 0affcac9-a3a5-74bb-502b-00000000027e 12033 1726867173.91474: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027e 12033 1726867173.91480: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867173.91549: no more pending results, returning what we have 12033 1726867173.91553: results queue empty 12033 1726867173.91554: checking for any_errors_fatal 12033 1726867173.91560: done checking for any_errors_fatal 12033 1726867173.91560: checking for max_fail_percentage 12033 1726867173.91562: done checking for max_fail_percentage 12033 1726867173.91563: checking to see if all hosts have failed and the running result is not ok 12033 1726867173.91563: done checking to see if all hosts have failed 12033 1726867173.91564: getting the remaining hosts for this loop 12033 1726867173.91565: done getting the remaining hosts for this loop 12033 1726867173.91569: getting the next task for host managed_node3 12033 1726867173.91579: done getting next task for host managed_node3 12033 1726867173.91583: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12033 1726867173.91588: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867173.91610: getting variables 12033 1726867173.91611: in VariableManager get_vars() 12033 1726867173.91642: Calling all_inventory to load vars for managed_node3 12033 1726867173.91645: Calling groups_inventory to load vars for managed_node3 12033 1726867173.91647: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867173.91655: Calling all_plugins_play to load vars for managed_node3 12033 1726867173.91657: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867173.91659: Calling groups_plugins_play to load vars for managed_node3 12033 1726867173.92461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867173.93911: done with get_vars() 12033 1726867173.93926: done getting variables 12033 1726867173.93969: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:19:33 -0400 (0:00:00.114) 0:00:13.055 ****** 
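The "Install packages" task starting here loads the generic `package` action plugin, with `network_packages` resolved from role defaults a few entries below. A hedged sketch of what such a step typically looks like, using names visible in the log (the exact task body in the role may differ):

```yaml
# Illustrative sketch of a role-level package install step.
# "network_packages" is a role default, resolved in the surrounding log;
# the when-guard mirrors the EL6 exclusion evaluated earlier in the trace.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: ansible_distribution_major_version != '6'
```

The `package` action delegates to the platform's native package-manager module at runtime, which is why the log shows the action plugin being loaded from plugins/action/package.py rather than a specific dnf or apt module.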
12033 1726867173.93997: entering _queue_task() for managed_node3/package 12033 1726867173.94262: worker is 1 (out of 1 available) 12033 1726867173.94278: exiting _queue_task() for managed_node3/package 12033 1726867173.94298: done queuing things up, now waiting for results queue to drain 12033 1726867173.94309: waiting for pending results... 12033 1726867173.94514: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12033 1726867173.94612: in run() - task 0affcac9-a3a5-74bb-502b-00000000027f 12033 1726867173.94635: variable 'ansible_search_path' from source: unknown 12033 1726867173.94638: variable 'ansible_search_path' from source: unknown 12033 1726867173.94686: calling self._execute() 12033 1726867173.94740: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867173.94743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867173.94755: variable 'omit' from source: magic vars 12033 1726867173.95094: variable 'ansible_distribution_major_version' from source: facts 12033 1726867173.95120: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867173.95288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867173.95510: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867173.95540: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867173.95564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867173.95606: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867173.95686: variable 'network_packages' from source: role '' defaults 12033 1726867173.95770: variable '__network_provider_setup' from source: role '' defaults 12033 
1726867173.95780: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867173.95833: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867173.95869: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867173.95906: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867173.96052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867174.01399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867174.01404: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867174.01419: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867174.01445: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867174.01479: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867174.01546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.01627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.01630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.01660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.01681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.01729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.01752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.01775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.01815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.01825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.02032: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867174.02131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.02155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12033 1726867174.02176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.02303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.02306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.02309: variable 'ansible_python' from source: facts 12033 1726867174.02322: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867174.02401: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867174.02484: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867174.02614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.02644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.02667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.02709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 
1726867174.02795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.02798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.02812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.02814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.02849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.02863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.03020: variable 'network_connections' from source: include params 12033 1726867174.03025: variable 'controller_profile' from source: play vars 12033 1726867174.03121: variable 'controller_profile' from source: play vars 12033 1726867174.03131: variable 'controller_device' from source: play vars 12033 1726867174.03236: variable 'controller_device' from source: play vars 12033 1726867174.03250: variable 'port1_profile' from source: play vars 12033 1726867174.03349: variable 'port1_profile' from source: play vars 12033 1726867174.03357: variable 'dhcp_interface1' from source: play vars 12033 1726867174.03467: variable 
'dhcp_interface1' from source: play vars 12033 1726867174.03489: variable 'controller_profile' from source: play vars 12033 1726867174.03586: variable 'controller_profile' from source: play vars 12033 1726867174.03605: variable 'port2_profile' from source: play vars 12033 1726867174.03720: variable 'port2_profile' from source: play vars 12033 1726867174.03723: variable 'dhcp_interface2' from source: play vars 12033 1726867174.03822: variable 'dhcp_interface2' from source: play vars 12033 1726867174.03886: variable 'controller_profile' from source: play vars 12033 1726867174.03937: variable 'controller_profile' from source: play vars 12033 1726867174.04026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867174.04052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867174.04103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.04112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867174.04151: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867174.04457: variable 'network_connections' from source: include params 12033 1726867174.04461: variable 'controller_profile' from source: play vars 12033 1726867174.04559: variable 'controller_profile' from source: play vars 12033 1726867174.04571: variable 'controller_device' from source: play vars 12033 1726867174.04683: variable 'controller_device' from source: play vars 
12033 1726867174.04696: variable 'port1_profile' from source: play vars 12033 1726867174.04827: variable 'port1_profile' from source: play vars 12033 1726867174.04831: variable 'dhcp_interface1' from source: play vars 12033 1726867174.04914: variable 'dhcp_interface1' from source: play vars 12033 1726867174.04957: variable 'controller_profile' from source: play vars 12033 1726867174.05024: variable 'controller_profile' from source: play vars 12033 1726867174.05033: variable 'port2_profile' from source: play vars 12033 1726867174.05183: variable 'port2_profile' from source: play vars 12033 1726867174.05186: variable 'dhcp_interface2' from source: play vars 12033 1726867174.05251: variable 'dhcp_interface2' from source: play vars 12033 1726867174.05259: variable 'controller_profile' from source: play vars 12033 1726867174.05356: variable 'controller_profile' from source: play vars 12033 1726867174.05417: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867174.05504: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867174.05847: variable 'network_connections' from source: include params 12033 1726867174.05851: variable 'controller_profile' from source: play vars 12033 1726867174.05906: variable 'controller_profile' from source: play vars 12033 1726867174.05909: variable 'controller_device' from source: play vars 12033 1726867174.05974: variable 'controller_device' from source: play vars 12033 1726867174.05988: variable 'port1_profile' from source: play vars 12033 1726867174.06059: variable 'port1_profile' from source: play vars 12033 1726867174.06066: variable 'dhcp_interface1' from source: play vars 12033 1726867174.06140: variable 'dhcp_interface1' from source: play vars 12033 1726867174.06156: variable 'controller_profile' from source: play vars 12033 1726867174.06226: variable 'controller_profile' from source: play vars 12033 1726867174.06229: variable 'port2_profile' from source: play 
vars 12033 1726867174.06379: variable 'port2_profile' from source: play vars 12033 1726867174.06382: variable 'dhcp_interface2' from source: play vars 12033 1726867174.06385: variable 'dhcp_interface2' from source: play vars 12033 1726867174.06387: variable 'controller_profile' from source: play vars 12033 1726867174.06454: variable 'controller_profile' from source: play vars 12033 1726867174.06489: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867174.06566: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867174.07008: variable 'network_connections' from source: include params 12033 1726867174.07013: variable 'controller_profile' from source: play vars 12033 1726867174.07015: variable 'controller_profile' from source: play vars 12033 1726867174.07017: variable 'controller_device' from source: play vars 12033 1726867174.07073: variable 'controller_device' from source: play vars 12033 1726867174.07081: variable 'port1_profile' from source: play vars 12033 1726867174.07148: variable 'port1_profile' from source: play vars 12033 1726867174.07155: variable 'dhcp_interface1' from source: play vars 12033 1726867174.07225: variable 'dhcp_interface1' from source: play vars 12033 1726867174.07232: variable 'controller_profile' from source: play vars 12033 1726867174.07307: variable 'controller_profile' from source: play vars 12033 1726867174.07317: variable 'port2_profile' from source: play vars 12033 1726867174.07391: variable 'port2_profile' from source: play vars 12033 1726867174.07395: variable 'dhcp_interface2' from source: play vars 12033 1726867174.07462: variable 'dhcp_interface2' from source: play vars 12033 1726867174.07465: variable 'controller_profile' from source: play vars 12033 1726867174.07570: variable 'controller_profile' from source: play vars 12033 1726867174.07596: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867174.07655: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867174.07659: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867174.07728: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867174.07993: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867174.08363: variable 'network_connections' from source: include params 12033 1726867174.08366: variable 'controller_profile' from source: play vars 12033 1726867174.08429: variable 'controller_profile' from source: play vars 12033 1726867174.08436: variable 'controller_device' from source: play vars 12033 1726867174.08728: variable 'controller_device' from source: play vars 12033 1726867174.08732: variable 'port1_profile' from source: play vars 12033 1726867174.08734: variable 'port1_profile' from source: play vars 12033 1726867174.08737: variable 'dhcp_interface1' from source: play vars 12033 1726867174.08739: variable 'dhcp_interface1' from source: play vars 12033 1726867174.08740: variable 'controller_profile' from source: play vars 12033 1726867174.08742: variable 'controller_profile' from source: play vars 12033 1726867174.08744: variable 'port2_profile' from source: play vars 12033 1726867174.08770: variable 'port2_profile' from source: play vars 12033 1726867174.08778: variable 'dhcp_interface2' from source: play vars 12033 1726867174.08982: variable 'dhcp_interface2' from source: play vars 12033 1726867174.08986: variable 'controller_profile' from source: play vars 12033 1726867174.08988: variable 'controller_profile' from source: play vars 12033 1726867174.08990: variable 'ansible_distribution' from source: facts 12033 1726867174.08992: variable '__network_rh_distros' from source: role '' defaults 12033 1726867174.08994: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.08996: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12033 1726867174.09115: variable 'ansible_distribution' from source: facts
12033 1726867174.09119: variable '__network_rh_distros' from source: role '' defaults
12033 1726867174.09121: variable 'ansible_distribution_major_version' from source: facts
12033 1726867174.09129: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12033 1726867174.09289: variable 'ansible_distribution' from source: facts
12033 1726867174.09295: variable '__network_rh_distros' from source: role '' defaults
12033 1726867174.09308: variable 'ansible_distribution_major_version' from source: facts
12033 1726867174.09344: variable 'network_provider' from source: set_fact
12033 1726867174.09356: variable 'ansible_facts' from source: unknown
12033 1726867174.10009: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12033 1726867174.10013: when evaluation is False, skipping this task
12033 1726867174.10015: _execute() done
12033 1726867174.10018: dumping result to json
12033 1726867174.10019: done dumping result, returning
12033 1726867174.10041: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-74bb-502b-00000000027f]
12033 1726867174.10044: sending task result for task 0affcac9-a3a5-74bb-502b-00000000027f
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12033 1726867174.10249: no more pending results, returning what we have
12033 1726867174.10252: results queue empty
12033 1726867174.10253: checking for any_errors_fatal
12033 1726867174.10257: done checking for any_errors_fatal
12033 1726867174.10257: checking for max_fail_percentage
12033 1726867174.10259: done checking for max_fail_percentage
12033 1726867174.10260: checking to see if all hosts have failed and the running result is not ok
12033 1726867174.10261: done checking to see if all hosts have failed
12033 1726867174.10261: getting the remaining hosts for this loop
12033 1726867174.10263: done getting the remaining hosts for this loop
12033 1726867174.10266: getting the next task for host managed_node3
12033 1726867174.10272: done getting next task for host managed_node3
12033 1726867174.10276: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12033 1726867174.10283: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12033 1726867174.10296: getting variables
12033 1726867174.10297: in VariableManager get_vars()
12033 1726867174.10330: Calling all_inventory to load vars for managed_node3
12033 1726867174.10333: Calling groups_inventory to load vars for managed_node3
12033 1726867174.10335: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867174.10344: Calling all_plugins_play to load vars for managed_node3
12033 1726867174.10346: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867174.10349: Calling groups_plugins_play to load vars for managed_node3
12033 1726867174.10897: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000027f
12033 1726867174.10900: WORKER PROCESS EXITING
12033 1726867174.16365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867174.17976: done with get_vars()
12033 1726867174.18001: done getting variables
12033 1726867174.18038: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 17:19:34 -0400 (0:00:00.240) 0:00:13.296 ******
12033 1726867174.18059: entering _queue_task() for managed_node3/package
12033 1726867174.18297: worker is 1 (out of 1 available)
12033 1726867174.18313: exiting _queue_task() for managed_node3/package
12033 1726867174.18326: done queuing things up, now waiting for results queue to drain
12033 1726867174.18328: waiting for pending results...
12033 1726867174.18511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12033 1726867174.18608: in run() - task 0affcac9-a3a5-74bb-502b-000000000280
12033 1726867174.18642: variable 'ansible_search_path' from source: unknown
12033 1726867174.18646: variable 'ansible_search_path' from source: unknown
12033 1726867174.18682: calling self._execute()
12033 1726867174.18757: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867174.18762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867174.18775: variable 'omit' from source: magic vars
12033 1726867174.19117: variable 'ansible_distribution_major_version' from source: facts
12033 1726867174.19127: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867174.19212: variable 'network_state' from source: role '' defaults
12033 1726867174.19220: Evaluated conditional (network_state != {}): False
12033 1726867174.19223: when evaluation is False, skipping this task
12033 1726867174.19230: _execute() done
12033 1726867174.19235: dumping result to json
12033 1726867174.19245: done dumping result, returning
12033 1726867174.19257: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000280]
12033 1726867174.19260: sending task result for task 0affcac9-a3a5-74bb-502b-000000000280
12033 1726867174.19359: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000280
12033 1726867174.19366: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12033 1726867174.19598: no more pending results, returning what we have
12033 1726867174.19602: results queue empty
12033 1726867174.19603: checking for any_errors_fatal
12033 1726867174.19608: done checking for any_errors_fatal
12033 1726867174.19609: checking for max_fail_percentage
12033 1726867174.19611: done checking for max_fail_percentage
12033 1726867174.19612: checking to see if all hosts have failed and the running result is not ok
12033 1726867174.19612: done checking to see if all hosts have failed
12033 1726867174.19613: getting the remaining hosts for this loop
12033 1726867174.19614: done getting the remaining hosts for this loop
12033 1726867174.19617: getting the next task for host managed_node3
12033 1726867174.19624: done getting next task for host managed_node3
12033 1726867174.19627: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12033 1726867174.19633: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12033 1726867174.19647: getting variables
12033 1726867174.19649: in VariableManager get_vars()
12033 1726867174.19680: Calling all_inventory to load vars for managed_node3
12033 1726867174.19683: Calling groups_inventory to load vars for managed_node3
12033 1726867174.19686: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867174.19694: Calling all_plugins_play to load vars for managed_node3
12033 1726867174.19697: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867174.19700: Calling groups_plugins_play to load vars for managed_node3
12033 1726867174.20962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867174.22010: done with get_vars()
12033 1726867174.22024: done getting variables
12033 1726867174.22065: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 17:19:34 -0400 (0:00:00.040) 0:00:13.336 ******
12033 1726867174.22091: entering _queue_task() for managed_node3/package
12033 1726867174.22309: worker is 1 (out of 1 available)
12033 1726867174.22322: exiting _queue_task() for managed_node3/package
12033 1726867174.22336: done queuing things up, now waiting for results queue to drain
12033 1726867174.22338: waiting for pending results...
12033 1726867174.22530: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12033 1726867174.22710: in run() - task 0affcac9-a3a5-74bb-502b-000000000281
12033 1726867174.22714: variable 'ansible_search_path' from source: unknown
12033 1726867174.22716: variable 'ansible_search_path' from source: unknown
12033 1726867174.22719: calling self._execute()
12033 1726867174.22862: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867174.22866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867174.22868: variable 'omit' from source: magic vars
12033 1726867174.23383: variable 'ansible_distribution_major_version' from source: facts
12033 1726867174.23387: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867174.23424: variable 'network_state' from source: role '' defaults
12033 1726867174.23440: Evaluated conditional (network_state != {}): False
12033 1726867174.23583: when evaluation is False, skipping this task
12033 1726867174.23586: _execute() done
12033 1726867174.23591: dumping result to json
12033 1726867174.23593: done dumping result, returning
12033 1726867174.23596: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000281]
12033 1726867174.23598: sending task result for task 0affcac9-a3a5-74bb-502b-000000000281
12033 1726867174.23665: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000281
12033 1726867174.23668: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12033 1726867174.23717: no more pending results, returning what we have
12033 1726867174.23721: results queue empty
12033 1726867174.23722: checking for any_errors_fatal
12033 1726867174.23729: done checking for any_errors_fatal
12033 1726867174.23730: checking for max_fail_percentage
12033 1726867174.23732: done checking for max_fail_percentage
12033 1726867174.23733: checking to see if all hosts have failed and the running result is not ok
12033 1726867174.23733: done checking to see if all hosts have failed
12033 1726867174.23734: getting the remaining hosts for this loop
12033 1726867174.23736: done getting the remaining hosts for this loop
12033 1726867174.23739: getting the next task for host managed_node3
12033 1726867174.23747: done getting next task for host managed_node3
12033 1726867174.23751: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12033 1726867174.23756: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12033 1726867174.23772: getting variables
12033 1726867174.23773: in VariableManager get_vars()
12033 1726867174.23925: Calling all_inventory to load vars for managed_node3
12033 1726867174.23928: Calling groups_inventory to load vars for managed_node3
12033 1726867174.23931: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867174.23944: Calling all_plugins_play to load vars for managed_node3
12033 1726867174.23948: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867174.23951: Calling groups_plugins_play to load vars for managed_node3
12033 1726867174.24937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867174.27135: done with get_vars()
12033 1726867174.27158: done getting variables
12033 1726867174.27323: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 17:19:34 -0400 (0:00:00.052) 0:00:13.389 ******
12033 1726867174.27354: entering _queue_task() for managed_node3/service
12033 1726867174.27356: Creating lock for service
12033 1726867174.27696: worker is 1 (out of 1 available)
12033 1726867174.27708: exiting _queue_task() for managed_node3/service
12033 1726867174.27721: done queuing things up, now waiting for results queue to drain
12033 1726867174.27770: waiting for pending results...
12033 1726867174.28048: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867174.28151: in run() - task 0affcac9-a3a5-74bb-502b-000000000282 12033 1726867174.28162: variable 'ansible_search_path' from source: unknown 12033 1726867174.28165: variable 'ansible_search_path' from source: unknown 12033 1726867174.28197: calling self._execute() 12033 1726867174.28269: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867174.28290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867174.28294: variable 'omit' from source: magic vars 12033 1726867174.28885: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.28889: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867174.28891: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867174.29038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867174.32232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867174.32315: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867174.32352: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867174.32392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867174.32428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867174.32514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12033 1726867174.32547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.32572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.32615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.32636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.32683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.32707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.32733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.32779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.32794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.32835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.32862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.32888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.32925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.32939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.33118: variable 'network_connections' from source: include params 12033 1726867174.33134: variable 'controller_profile' from source: play vars 12033 1726867174.33285: variable 'controller_profile' from source: play vars 12033 1726867174.33291: variable 'controller_device' from source: play vars 12033 1726867174.33294: variable 'controller_device' from source: play vars 12033 1726867174.33299: variable 'port1_profile' from source: play vars 12033 1726867174.33369: variable 'port1_profile' from source: play vars 12033 1726867174.33399: variable 'dhcp_interface1' from source: play vars 12033 1726867174.33443: variable 'dhcp_interface1' from source: play vars 12033 1726867174.33460: variable 'controller_profile' from source: play vars 
12033 1726867174.33513: variable 'controller_profile' from source: play vars 12033 1726867174.33519: variable 'port2_profile' from source: play vars 12033 1726867174.33608: variable 'port2_profile' from source: play vars 12033 1726867174.33612: variable 'dhcp_interface2' from source: play vars 12033 1726867174.33782: variable 'dhcp_interface2' from source: play vars 12033 1726867174.33785: variable 'controller_profile' from source: play vars 12033 1726867174.33791: variable 'controller_profile' from source: play vars 12033 1726867174.33794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867174.34043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867174.34047: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867174.34050: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867174.34065: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867174.34109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867174.34129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867174.34160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.34191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12033 1726867174.34249: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867174.34669: variable 'network_connections' from source: include params 12033 1726867174.34672: variable 'controller_profile' from source: play vars 12033 1726867174.34675: variable 'controller_profile' from source: play vars 12033 1726867174.34679: variable 'controller_device' from source: play vars 12033 1726867174.34772: variable 'controller_device' from source: play vars 12033 1726867174.34775: variable 'port1_profile' from source: play vars 12033 1726867174.34815: variable 'port1_profile' from source: play vars 12033 1726867174.34821: variable 'dhcp_interface1' from source: play vars 12033 1726867174.34899: variable 'dhcp_interface1' from source: play vars 12033 1726867174.34905: variable 'controller_profile' from source: play vars 12033 1726867174.35026: variable 'controller_profile' from source: play vars 12033 1726867174.35029: variable 'port2_profile' from source: play vars 12033 1726867174.35047: variable 'port2_profile' from source: play vars 12033 1726867174.35050: variable 'dhcp_interface2' from source: play vars 12033 1726867174.35133: variable 'dhcp_interface2' from source: play vars 12033 1726867174.35137: variable 'controller_profile' from source: play vars 12033 1726867174.35170: variable 'controller_profile' from source: play vars 12033 1726867174.35207: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867174.35210: when evaluation is False, skipping this task 12033 1726867174.35213: _execute() done 12033 1726867174.35215: dumping result to json 12033 1726867174.35217: done dumping result, returning 12033 1726867174.35223: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000282] 12033 1726867174.35227: sending task result for 
task 0affcac9-a3a5-74bb-502b-000000000282 12033 1726867174.35464: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000282 12033 1726867174.35467: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867174.35540: no more pending results, returning what we have 12033 1726867174.35544: results queue empty 12033 1726867174.35545: checking for any_errors_fatal 12033 1726867174.35550: done checking for any_errors_fatal 12033 1726867174.35550: checking for max_fail_percentage 12033 1726867174.35552: done checking for max_fail_percentage 12033 1726867174.35553: checking to see if all hosts have failed and the running result is not ok 12033 1726867174.35553: done checking to see if all hosts have failed 12033 1726867174.35554: getting the remaining hosts for this loop 12033 1726867174.35555: done getting the remaining hosts for this loop 12033 1726867174.35558: getting the next task for host managed_node3 12033 1726867174.35564: done getting next task for host managed_node3 12033 1726867174.35568: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867174.35572: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867174.35592: getting variables 12033 1726867174.35595: in VariableManager get_vars() 12033 1726867174.35632: Calling all_inventory to load vars for managed_node3 12033 1726867174.35635: Calling groups_inventory to load vars for managed_node3 12033 1726867174.35638: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867174.35646: Calling all_plugins_play to load vars for managed_node3 12033 1726867174.35649: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867174.35651: Calling groups_plugins_play to load vars for managed_node3 12033 1726867174.37331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867174.39252: done with get_vars() 12033 1726867174.39284: done getting variables 12033 1726867174.39346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:19:34 -0400 (0:00:00.120) 0:00:13.509 
****** 12033 1726867174.39388: entering _queue_task() for managed_node3/service 12033 1726867174.39696: worker is 1 (out of 1 available) 12033 1726867174.39709: exiting _queue_task() for managed_node3/service 12033 1726867174.39719: done queuing things up, now waiting for results queue to drain 12033 1726867174.39721: waiting for pending results... 12033 1726867174.40103: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867174.40218: in run() - task 0affcac9-a3a5-74bb-502b-000000000283 12033 1726867174.40345: variable 'ansible_search_path' from source: unknown 12033 1726867174.40351: variable 'ansible_search_path' from source: unknown 12033 1726867174.40355: calling self._execute() 12033 1726867174.40370: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867174.40378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867174.40392: variable 'omit' from source: magic vars 12033 1726867174.40818: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.40830: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867174.41016: variable 'network_provider' from source: set_fact 12033 1726867174.41020: variable 'network_state' from source: role '' defaults 12033 1726867174.41031: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12033 1726867174.41037: variable 'omit' from source: magic vars 12033 1726867174.41112: variable 'omit' from source: magic vars 12033 1726867174.41145: variable 'network_service_name' from source: role '' defaults 12033 1726867174.41231: variable 'network_service_name' from source: role '' defaults 12033 1726867174.41382: variable '__network_provider_setup' from source: role '' defaults 12033 1726867174.41385: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867174.41413: variable 
'__network_service_name_default_nm' from source: role '' defaults 12033 1726867174.41422: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867174.41498: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867174.42009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867174.44929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867174.45009: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867174.45046: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867174.45095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867174.45282: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867174.45286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.45292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.45294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.45397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.45450: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.45646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.45670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.45709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.45771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.45853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.46352: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867174.46609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.46747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.46751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.46811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.46832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.47027: variable 'ansible_python' from source: facts 12033 1726867174.47103: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867174.47193: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867174.47491: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867174.47723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.47746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.47769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.47897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.47919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.47964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867174.48030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867174.48056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.48159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867174.48163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867174.48472: variable 'network_connections' from source: include params 12033 1726867174.48483: variable 'controller_profile' from source: play vars 12033 1726867174.48640: variable 'controller_profile' from source: play vars 12033 1726867174.48652: variable 'controller_device' from source: play vars 12033 1726867174.48844: variable 'controller_device' from source: play vars 12033 1726867174.48861: variable 'port1_profile' from source: play vars 12033 1726867174.49182: variable 'port1_profile' from source: play vars 12033 1726867174.49185: variable 'dhcp_interface1' from source: play vars 12033 1726867174.49239: variable 'dhcp_interface1' from source: play vars 12033 1726867174.49291: variable 'controller_profile' from source: play vars 
12033 1726867174.49397: variable 'controller_profile' from source: play vars 12033 1726867174.49414: variable 'port2_profile' from source: play vars 12033 1726867174.49603: variable 'port2_profile' from source: play vars 12033 1726867174.49614: variable 'dhcp_interface2' from source: play vars 12033 1726867174.49825: variable 'dhcp_interface2' from source: play vars 12033 1726867174.49837: variable 'controller_profile' from source: play vars 12033 1726867174.50033: variable 'controller_profile' from source: play vars 12033 1726867174.50174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867174.50585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867174.50591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867174.50594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867174.50596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867174.50599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867174.50625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867174.50663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867174.50700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12033 1726867174.50749: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867174.51149: variable 'network_connections' from source: include params 12033 1726867174.51152: variable 'controller_profile' from source: play vars 12033 1726867174.51155: variable 'controller_profile' from source: play vars 12033 1726867174.51157: variable 'controller_device' from source: play vars 12033 1726867174.51219: variable 'controller_device' from source: play vars 12033 1726867174.51236: variable 'port1_profile' from source: play vars 12033 1726867174.51307: variable 'port1_profile' from source: play vars 12033 1726867174.51328: variable 'dhcp_interface1' from source: play vars 12033 1726867174.51398: variable 'dhcp_interface1' from source: play vars 12033 1726867174.51488: variable 'controller_profile' from source: play vars 12033 1726867174.51684: variable 'controller_profile' from source: play vars 12033 1726867174.51687: variable 'port2_profile' from source: play vars 12033 1726867174.51731: variable 'port2_profile' from source: play vars 12033 1726867174.51742: variable 'dhcp_interface2' from source: play vars 12033 1726867174.51932: variable 'dhcp_interface2' from source: play vars 12033 1726867174.51939: variable 'controller_profile' from source: play vars 12033 1726867174.52151: variable 'controller_profile' from source: play vars 12033 1726867174.52296: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867174.52370: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867174.53120: variable 'network_connections' from source: include params 12033 1726867174.53125: variable 'controller_profile' from source: play vars 12033 1726867174.53293: variable 'controller_profile' from source: play vars 12033 1726867174.53297: variable 'controller_device' from source: play vars 12033 1726867174.53539: variable 'controller_device' from source: play vars 
12033 1726867174.53542: variable 'port1_profile' from source: play vars 12033 1726867174.53544: variable 'port1_profile' from source: play vars 12033 1726867174.53546: variable 'dhcp_interface1' from source: play vars 12033 1726867174.53554: variable 'dhcp_interface1' from source: play vars 12033 1726867174.53562: variable 'controller_profile' from source: play vars 12033 1726867174.53680: variable 'controller_profile' from source: play vars 12033 1726867174.53684: variable 'port2_profile' from source: play vars 12033 1726867174.53723: variable 'port2_profile' from source: play vars 12033 1726867174.53731: variable 'dhcp_interface2' from source: play vars 12033 1726867174.53802: variable 'dhcp_interface2' from source: play vars 12033 1726867174.53815: variable 'controller_profile' from source: play vars 12033 1726867174.53895: variable 'controller_profile' from source: play vars 12033 1726867174.53911: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867174.54112: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867174.54329: variable 'network_connections' from source: include params 12033 1726867174.54332: variable 'controller_profile' from source: play vars 12033 1726867174.54401: variable 'controller_profile' from source: play vars 12033 1726867174.54409: variable 'controller_device' from source: play vars 12033 1726867174.54582: variable 'controller_device' from source: play vars 12033 1726867174.54585: variable 'port1_profile' from source: play vars 12033 1726867174.54588: variable 'port1_profile' from source: play vars 12033 1726867174.54593: variable 'dhcp_interface1' from source: play vars 12033 1726867174.54642: variable 'dhcp_interface1' from source: play vars 12033 1726867174.54648: variable 'controller_profile' from source: play vars 12033 1726867174.54852: variable 'controller_profile' from source: play vars 12033 1726867174.54855: variable 'port2_profile' from source: play vars 
12033 1726867174.54858: variable 'port2_profile' from source: play vars 12033 1726867174.54860: variable 'dhcp_interface2' from source: play vars 12033 1726867174.54881: variable 'dhcp_interface2' from source: play vars 12033 1726867174.54888: variable 'controller_profile' from source: play vars 12033 1726867174.54984: variable 'controller_profile' from source: play vars 12033 1726867174.55030: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867174.55091: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867174.55099: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867174.55284: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867174.55954: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867174.57258: variable 'network_connections' from source: include params 12033 1726867174.57261: variable 'controller_profile' from source: play vars 12033 1726867174.57439: variable 'controller_profile' from source: play vars 12033 1726867174.57447: variable 'controller_device' from source: play vars 12033 1726867174.57527: variable 'controller_device' from source: play vars 12033 1726867174.57539: variable 'port1_profile' from source: play vars 12033 1726867174.57596: variable 'port1_profile' from source: play vars 12033 1726867174.57602: variable 'dhcp_interface1' from source: play vars 12033 1726867174.57874: variable 'dhcp_interface1' from source: play vars 12033 1726867174.57879: variable 'controller_profile' from source: play vars 12033 1726867174.58008: variable 'controller_profile' from source: play vars 12033 1726867174.58012: variable 'port2_profile' from source: play vars 12033 1726867174.58161: variable 'port2_profile' from source: play vars 12033 1726867174.58168: variable 'dhcp_interface2' from source: play vars 12033 1726867174.58226: variable 
'dhcp_interface2' from source: play vars 12033 1726867174.58232: variable 'controller_profile' from source: play vars 12033 1726867174.58403: variable 'controller_profile' from source: play vars 12033 1726867174.58413: variable 'ansible_distribution' from source: facts 12033 1726867174.58416: variable '__network_rh_distros' from source: role '' defaults 12033 1726867174.58421: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.58447: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867174.58723: variable 'ansible_distribution' from source: facts 12033 1726867174.58726: variable '__network_rh_distros' from source: role '' defaults 12033 1726867174.58751: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.58755: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867174.59153: variable 'ansible_distribution' from source: facts 12033 1726867174.59156: variable '__network_rh_distros' from source: role '' defaults 12033 1726867174.59159: variable 'ansible_distribution_major_version' from source: facts 12033 1726867174.59194: variable 'network_provider' from source: set_fact 12033 1726867174.59218: variable 'omit' from source: magic vars 12033 1726867174.59359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867174.59386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867174.59404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867174.59421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867174.59430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 12033 1726867174.59607: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867174.59811: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867174.59815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867174.59948: Set connection var ansible_pipelining to False 12033 1726867174.59951: Set connection var ansible_shell_executable to /bin/sh 12033 1726867174.59953: Set connection var ansible_timeout to 10 12033 1726867174.59959: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867174.59965: Set connection var ansible_connection to ssh 12033 1726867174.59974: Set connection var ansible_shell_type to sh 12033 1726867174.60007: variable 'ansible_shell_executable' from source: unknown 12033 1726867174.60014: variable 'ansible_connection' from source: unknown 12033 1726867174.60063: variable 'ansible_module_compression' from source: unknown 12033 1726867174.60070: variable 'ansible_shell_type' from source: unknown 12033 1726867174.60076: variable 'ansible_shell_executable' from source: unknown 12033 1726867174.60085: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867174.60095: variable 'ansible_pipelining' from source: unknown 12033 1726867174.60104: variable 'ansible_timeout' from source: unknown 12033 1726867174.60111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867174.60339: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867174.60399: variable 'omit' from source: magic vars 12033 1726867174.60409: starting attempt loop 12033 1726867174.60416: running the handler 12033 1726867174.60546: variable 
'ansible_facts' from source: unknown 12033 1726867174.62181: _low_level_execute_command(): starting 12033 1726867174.62259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867174.63686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867174.63894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867174.64018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867174.64104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867174.65873: stdout chunk (state=3): >>>/root <<< 12033 1726867174.65961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867174.65964: stdout chunk (state=3): >>><<< 12033 1726867174.65966: stderr chunk (state=3): >>><<< 12033 1726867174.65994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867174.66095: _low_level_execute_command(): starting 12033 1726867174.66098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524 `" && echo ansible-tmp-1726867174.6600683-12648-30322871259524="` echo /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524 `" ) && sleep 0' 12033 1726867174.66751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867174.66768: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867174.66793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867174.66863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867174.66894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867174.66910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867174.66932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867174.67095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867174.68979: stdout chunk (state=3): >>>ansible-tmp-1726867174.6600683-12648-30322871259524=/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524 <<< 12033 1726867174.69354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867174.69358: stdout chunk (state=3): >>><<< 12033 1726867174.69360: stderr chunk (state=3): >>><<< 12033 1726867174.69363: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867174.6600683-12648-30322871259524=/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867174.69483: variable 'ansible_module_compression' from source: unknown 12033 1726867174.69596: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 12033 1726867174.69604: ANSIBALLZ: Acquiring lock 12033 1726867174.69607: ANSIBALLZ: Lock acquired: 139897899327968 12033 1726867174.69610: ANSIBALLZ: Creating module 12033 1726867175.11083: ANSIBALLZ: Writing module into payload 12033 1726867175.11285: ANSIBALLZ: Writing module 12033 1726867175.11314: ANSIBALLZ: Renaming module 12033 1726867175.11326: ANSIBALLZ: Done creating module 12033 1726867175.11371: variable 'ansible_facts' from source: unknown 12033 1726867175.11608: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py 12033 1726867175.11812: Sending initial data 12033 1726867175.11821: Sent initial data (155 bytes) 12033 1726867175.12699: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867175.12806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867175.12920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867175.12972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867175.12999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867175.13074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867175.14963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867175.14968: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867175.15217: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp3uxaqp59 /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py <<< 12033 1726867175.15222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py" <<< 12033 1726867175.15225: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp3uxaqp59" to remote "/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py" <<< 12033 1726867175.16996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867175.16999: stderr chunk (state=3): >>><<< 12033 1726867175.17001: stdout chunk (state=3): >>><<< 12033 1726867175.17003: done transferring module to remote 12033 1726867175.17009: _low_level_execute_command(): starting 12033 1726867175.17011: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/ /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py && sleep 0' 12033 1726867175.17540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867175.17546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867175.17560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867175.17570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867175.17600: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867175.17609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867175.17694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867175.17710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867175.17805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867175.19633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867175.19637: stdout chunk (state=3): >>><<< 12033 1726867175.19639: stderr chunk (state=3): >>><<< 12033 1726867175.19873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867175.19876: _low_level_execute_command(): starting 12033 1726867175.19881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/AnsiballZ_systemd.py && sleep 0' 12033 1726867175.20893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867175.20900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867175.20911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867175.20926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867175.20938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867175.20946: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867175.20960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867175.20970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867175.20979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867175.20987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867175.21081: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867175.21239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867175.21523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867175.50417: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": 
"702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10293248", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313741824", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "579162000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 12033 1726867175.50467: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12033 1726867175.52493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867175.52497: stdout chunk (state=3): >>><<< 12033 1726867175.52499: stderr chunk (state=3): >>><<< 12033 1726867175.52503: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10293248", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313741824", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "579162000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867175.52550: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867175.52571: _low_level_execute_command(): starting 12033 1726867175.52576: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867174.6600683-12648-30322871259524/ > /dev/null 2>&1 && sleep 0' 12033 1726867175.53165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867175.53175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867175.53191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867175.53204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867175.53217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867175.53224: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867175.53234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867175.53252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867175.53257: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867175.53264: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867175.53272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867175.53288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867175.53300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867175.53309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867175.53316: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867175.53326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867175.53395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867175.53406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867175.53425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867175.53501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867175.55413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867175.55441: stderr chunk (state=3): >>><<< 12033 1726867175.55444: stdout chunk (state=3): >>><<< 12033 1726867175.55682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867175.55686: handler run complete 12033 1726867175.55692: attempt loop complete, returning result 12033 1726867175.55694: _execute() done 12033 1726867175.55696: dumping result to json 12033 1726867175.55698: done dumping result, returning 12033 1726867175.55701: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-74bb-502b-000000000283] 12033 1726867175.55703: sending task result for task 0affcac9-a3a5-74bb-502b-000000000283 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867175.55960: no more pending results, returning what we have 12033 1726867175.55963: results queue empty 12033 1726867175.55964: checking for any_errors_fatal 12033 1726867175.55971: done checking for any_errors_fatal 12033 1726867175.55971: checking for max_fail_percentage 12033 1726867175.55973: done checking for max_fail_percentage 12033 1726867175.55974: checking to see if all hosts have failed and the running result is not ok 12033 1726867175.55974: done checking to see if all hosts have failed 12033 1726867175.55975: getting the remaining 
hosts for this loop 12033 1726867175.55976: done getting the remaining hosts for this loop 12033 1726867175.55982: getting the next task for host managed_node3 12033 1726867175.55988: done getting next task for host managed_node3 12033 1726867175.55991: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867175.55997: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867175.56008: getting variables 12033 1726867175.56009: in VariableManager get_vars() 12033 1726867175.56039: Calling all_inventory to load vars for managed_node3 12033 1726867175.56041: Calling groups_inventory to load vars for managed_node3 12033 1726867175.56044: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867175.56055: Calling all_plugins_play to load vars for managed_node3 12033 1726867175.56057: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867175.56060: Calling groups_plugins_play to load vars for managed_node3 12033 1726867175.56067: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000283 12033 1726867175.56809: WORKER PROCESS EXITING 12033 1726867175.57658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867175.59265: done with get_vars() 12033 1726867175.59291: done getting variables 12033 1726867175.59363: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:19:35 -0400 (0:00:01.200) 0:00:14.709 ****** 12033 1726867175.59402: entering _queue_task() for managed_node3/service 12033 1726867175.59861: worker is 1 (out of 1 available) 12033 1726867175.59992: exiting _queue_task() for managed_node3/service 12033 1726867175.60003: done queuing things up, now waiting for results queue to drain 12033 1726867175.60005: waiting for pending results... 
12033 1726867175.60197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867175.60245: in run() - task 0affcac9-a3a5-74bb-502b-000000000284 12033 1726867175.60265: variable 'ansible_search_path' from source: unknown 12033 1726867175.60285: variable 'ansible_search_path' from source: unknown 12033 1726867175.60341: calling self._execute() 12033 1726867175.60453: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.60465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.60482: variable 'omit' from source: magic vars 12033 1726867175.61282: variable 'ansible_distribution_major_version' from source: facts 12033 1726867175.61286: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867175.61288: variable 'network_provider' from source: set_fact 12033 1726867175.61293: Evaluated conditional (network_provider == "nm"): True 12033 1726867175.61452: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867175.61623: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867175.61869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867175.64357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867175.64426: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867175.64468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867175.64510: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867175.64537: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867175.64611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867175.64640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867175.64664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867175.64708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867175.64728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867175.64779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867175.64811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867175.64840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867175.64883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867175.64903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867175.64943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867175.64967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867175.65081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867175.65085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867175.65087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867175.65174: variable 'network_connections' from source: include params 12033 1726867175.65194: variable 'controller_profile' from source: play vars 12033 1726867175.65255: variable 'controller_profile' from source: play vars 12033 1726867175.65269: variable 'controller_device' from source: play vars 12033 1726867175.65332: variable 'controller_device' from source: play vars 12033 1726867175.65349: variable 'port1_profile' from source: play vars 12033 1726867175.65413: variable 'port1_profile' from source: play vars 
12033 1726867175.65426: variable 'dhcp_interface1' from source: play vars 12033 1726867175.65488: variable 'dhcp_interface1' from source: play vars 12033 1726867175.65502: variable 'controller_profile' from source: play vars 12033 1726867175.65562: variable 'controller_profile' from source: play vars 12033 1726867175.65575: variable 'port2_profile' from source: play vars 12033 1726867175.65641: variable 'port2_profile' from source: play vars 12033 1726867175.65654: variable 'dhcp_interface2' from source: play vars 12033 1726867175.65718: variable 'dhcp_interface2' from source: play vars 12033 1726867175.65882: variable 'controller_profile' from source: play vars 12033 1726867175.65885: variable 'controller_profile' from source: play vars 12033 1726867175.65888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867175.66038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867175.66080: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867175.66117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867175.66150: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867175.66198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867175.66225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867175.66255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 12033 1726867175.66298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867175.66349: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867175.66586: variable 'network_connections' from source: include params 12033 1726867175.66601: variable 'controller_profile' from source: play vars 12033 1726867175.66662: variable 'controller_profile' from source: play vars 12033 1726867175.66674: variable 'controller_device' from source: play vars 12033 1726867175.66740: variable 'controller_device' from source: play vars 12033 1726867175.66756: variable 'port1_profile' from source: play vars 12033 1726867175.66820: variable 'port1_profile' from source: play vars 12033 1726867175.66832: variable 'dhcp_interface1' from source: play vars 12033 1726867175.66899: variable 'dhcp_interface1' from source: play vars 12033 1726867175.66914: variable 'controller_profile' from source: play vars 12033 1726867175.66973: variable 'controller_profile' from source: play vars 12033 1726867175.66991: variable 'port2_profile' from source: play vars 12033 1726867175.67051: variable 'port2_profile' from source: play vars 12033 1726867175.67063: variable 'dhcp_interface2' from source: play vars 12033 1726867175.67126: variable 'dhcp_interface2' from source: play vars 12033 1726867175.67282: variable 'controller_profile' from source: play vars 12033 1726867175.67284: variable 'controller_profile' from source: play vars 12033 1726867175.67287: Evaluated conditional (__network_wpa_supplicant_required): False 12033 1726867175.67291: when evaluation is False, skipping this task 12033 1726867175.67293: _execute() done 12033 1726867175.67295: dumping result to json 12033 1726867175.67297: done dumping result, returning 12033 1726867175.67299: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-74bb-502b-000000000284] 12033 1726867175.67301: sending task result for task 0affcac9-a3a5-74bb-502b-000000000284 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12033 1726867175.67421: no more pending results, returning what we have 12033 1726867175.67425: results queue empty 12033 1726867175.67426: checking for any_errors_fatal 12033 1726867175.67564: done checking for any_errors_fatal 12033 1726867175.67566: checking for max_fail_percentage 12033 1726867175.67568: done checking for max_fail_percentage 12033 1726867175.67569: checking to see if all hosts have failed and the running result is not ok 12033 1726867175.67569: done checking to see if all hosts have failed 12033 1726867175.67570: getting the remaining hosts for this loop 12033 1726867175.67572: done getting the remaining hosts for this loop 12033 1726867175.67575: getting the next task for host managed_node3 12033 1726867175.67583: done getting next task for host managed_node3 12033 1726867175.67586: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867175.67590: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867175.67603: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000284 12033 1726867175.67606: WORKER PROCESS EXITING 12033 1726867175.67941: getting variables 12033 1726867175.67944: in VariableManager get_vars() 12033 1726867175.67976: Calling all_inventory to load vars for managed_node3 12033 1726867175.67982: Calling groups_inventory to load vars for managed_node3 12033 1726867175.67984: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867175.67995: Calling all_plugins_play to load vars for managed_node3 12033 1726867175.67998: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867175.68001: Calling groups_plugins_play to load vars for managed_node3 12033 1726867175.69408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867175.72190: done with get_vars() 12033 1726867175.72220: done getting variables 12033 1726867175.72288: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:19:35 -0400 (0:00:00.129) 0:00:14.839 ****** 12033 1726867175.72327: entering _queue_task() for managed_node3/service 12033 1726867175.72692: worker is 1 (out of 1 available) 12033 1726867175.72706: exiting _queue_task() for managed_node3/service 12033 1726867175.72718: done queuing things up, now waiting for results queue to drain 12033 1726867175.72719: waiting for pending results... 12033 1726867175.73197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867175.73203: in run() - task 0affcac9-a3a5-74bb-502b-000000000285 12033 1726867175.73206: variable 'ansible_search_path' from source: unknown 12033 1726867175.73208: variable 'ansible_search_path' from source: unknown 12033 1726867175.73211: calling self._execute() 12033 1726867175.73275: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.73283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.73300: variable 'omit' from source: magic vars 12033 1726867175.73700: variable 'ansible_distribution_major_version' from source: facts 12033 1726867175.73708: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867175.73842: variable 'network_provider' from source: set_fact 12033 1726867175.73867: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867175.73871: when evaluation is False, skipping this task 12033 1726867175.73875: _execute() done 12033 1726867175.73879: dumping result to json 12033 1726867175.73883: done dumping result, returning 12033 1726867175.73917: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-74bb-502b-000000000285] 12033 1726867175.73920: sending task result for task 
0affcac9-a3a5-74bb-502b-000000000285 12033 1726867175.73984: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000285 12033 1726867175.73986: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867175.74039: no more pending results, returning what we have 12033 1726867175.74044: results queue empty 12033 1726867175.74045: checking for any_errors_fatal 12033 1726867175.74062: done checking for any_errors_fatal 12033 1726867175.74063: checking for max_fail_percentage 12033 1726867175.74066: done checking for max_fail_percentage 12033 1726867175.74067: checking to see if all hosts have failed and the running result is not ok 12033 1726867175.74067: done checking to see if all hosts have failed 12033 1726867175.74068: getting the remaining hosts for this loop 12033 1726867175.74070: done getting the remaining hosts for this loop 12033 1726867175.74074: getting the next task for host managed_node3 12033 1726867175.74087: done getting next task for host managed_node3 12033 1726867175.74090: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867175.74096: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867175.74112: getting variables 12033 1726867175.74113: in VariableManager get_vars() 12033 1726867175.74149: Calling all_inventory to load vars for managed_node3 12033 1726867175.74153: Calling groups_inventory to load vars for managed_node3 12033 1726867175.74155: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867175.74285: Calling all_plugins_play to load vars for managed_node3 12033 1726867175.74293: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867175.74301: Calling groups_plugins_play to load vars for managed_node3 12033 1726867175.76299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867175.78157: done with get_vars() 12033 1726867175.78180: done getting variables 12033 1726867175.78242: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:19:35 -0400 (0:00:00.059) 
0:00:14.898 ****** 12033 1726867175.78276: entering _queue_task() for managed_node3/copy 12033 1726867175.78587: worker is 1 (out of 1 available) 12033 1726867175.78603: exiting _queue_task() for managed_node3/copy 12033 1726867175.78616: done queuing things up, now waiting for results queue to drain 12033 1726867175.78618: waiting for pending results... 12033 1726867175.78886: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867175.79485: in run() - task 0affcac9-a3a5-74bb-502b-000000000286 12033 1726867175.79491: variable 'ansible_search_path' from source: unknown 12033 1726867175.79494: variable 'ansible_search_path' from source: unknown 12033 1726867175.79497: calling self._execute() 12033 1726867175.79499: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.79502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.79505: variable 'omit' from source: magic vars 12033 1726867175.79817: variable 'ansible_distribution_major_version' from source: facts 12033 1726867175.79828: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867175.79955: variable 'network_provider' from source: set_fact 12033 1726867175.79961: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867175.79965: when evaluation is False, skipping this task 12033 1726867175.79967: _execute() done 12033 1726867175.79969: dumping result to json 12033 1726867175.79971: done dumping result, returning 12033 1726867175.79983: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-74bb-502b-000000000286] 12033 1726867175.79988: sending task result for task 0affcac9-a3a5-74bb-502b-000000000286 12033 1726867175.80093: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000286 
12033 1726867175.80095: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12033 1726867175.80170: no more pending results, returning what we have 12033 1726867175.80174: results queue empty 12033 1726867175.80175: checking for any_errors_fatal 12033 1726867175.80183: done checking for any_errors_fatal 12033 1726867175.80184: checking for max_fail_percentage 12033 1726867175.80186: done checking for max_fail_percentage 12033 1726867175.80187: checking to see if all hosts have failed and the running result is not ok 12033 1726867175.80187: done checking to see if all hosts have failed 12033 1726867175.80188: getting the remaining hosts for this loop 12033 1726867175.80192: done getting the remaining hosts for this loop 12033 1726867175.80195: getting the next task for host managed_node3 12033 1726867175.80202: done getting next task for host managed_node3 12033 1726867175.80205: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867175.80210: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867175.80222: getting variables 12033 1726867175.80224: in VariableManager get_vars() 12033 1726867175.80252: Calling all_inventory to load vars for managed_node3 12033 1726867175.80255: Calling groups_inventory to load vars for managed_node3 12033 1726867175.80257: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867175.80265: Calling all_plugins_play to load vars for managed_node3 12033 1726867175.80267: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867175.80270: Calling groups_plugins_play to load vars for managed_node3 12033 1726867175.81703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867175.83316: done with get_vars() 12033 1726867175.83335: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:19:35 -0400 (0:00:00.051) 0:00:14.949 ****** 12033 1726867175.83417: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867175.83419: Creating lock for fedora.linux_system_roles.network_connections 12033 1726867175.83685: worker is 1 (out of 1 available) 12033 1726867175.83700: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867175.83713: done queuing things up, now waiting for results queue to drain 12033 1726867175.83715: waiting for pending results... 
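The skip recorded above (`Evaluated conditional (network_provider == "initscripts"): False` followed by a result carrying `false_condition` and `skip_reason`) follows the standard `when:` short-circuit: every conditional must evaluate true, otherwise the task is skipped and the first false clause is reported. A minimal sketch of that gating, with hypothetical helper names and Python's `eval()` standing in for the Jinja2 templating ansible-core actually uses:

```python
# Hypothetical sketch of the conditional gating logged above; ansible-core
# renders 'when' clauses through Jinja2, which plain eval() only approximates
# for simple comparison expressions like the ones in this log.
def evaluate_conditionals(conditionals, variables):
    """Return (should_run, first_false_condition), mirroring the log output."""
    for cond in conditionals:
        if not eval(cond, {}, dict(variables)):  # assumption: simple expressions only
            return False, cond
    return True, None

facts = {"ansible_distribution_major_version": "9", "network_provider": "nm"}
run, false_cond = evaluate_conditionals(
    ["ansible_distribution_major_version != '6'",
     'network_provider == "initscripts"'],
    facts,
)
# Shape of the skipped-task result shown in the log.
result = {} if run else {"changed": False, "false_condition": false_cond,
                         "skip_reason": "Conditional result was False"}
```

With `network_provider` set from the earlier `set_fact` to something other than `initscripts`, the initscripts dependency task is skipped exactly as logged, and only the failing clause is surfaced in `false_condition`.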
12033 1726867175.84257: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867175.84504: in run() - task 0affcac9-a3a5-74bb-502b-000000000287 12033 1726867175.84518: variable 'ansible_search_path' from source: unknown 12033 1726867175.84521: variable 'ansible_search_path' from source: unknown 12033 1726867175.84617: calling self._execute() 12033 1726867175.84742: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.84780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.84783: variable 'omit' from source: magic vars 12033 1726867175.85250: variable 'ansible_distribution_major_version' from source: facts 12033 1726867175.85253: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867175.85256: variable 'omit' from source: magic vars 12033 1726867175.85294: variable 'omit' from source: magic vars 12033 1726867175.85460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867175.88882: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867175.88936: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867175.89230: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867175.89233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867175.89236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867175.89373: variable 'network_provider' from source: set_fact 12033 1726867175.89527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867175.89571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867175.89606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867175.89652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867175.89676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867175.89763: variable 'omit' from source: magic vars 12033 1726867175.89869: variable 'omit' from source: magic vars 12033 1726867175.90001: variable 'network_connections' from source: include params 12033 1726867175.90017: variable 'controller_profile' from source: play vars 12033 1726867175.90082: variable 'controller_profile' from source: play vars 12033 1726867175.90103: variable 'controller_device' from source: play vars 12033 1726867175.90163: variable 'controller_device' from source: play vars 12033 1726867175.90194: variable 'port1_profile' from source: play vars 12033 1726867175.90259: variable 'port1_profile' from source: play vars 12033 1726867175.90272: variable 'dhcp_interface1' from source: play vars 12033 1726867175.90342: variable 'dhcp_interface1' from source: play vars 12033 1726867175.90424: variable 'controller_profile' from source: play vars 12033 1726867175.90427: variable 'controller_profile' from source: play vars 12033 
1726867175.90429: variable 'port2_profile' from source: play vars 12033 1726867175.90491: variable 'port2_profile' from source: play vars 12033 1726867175.90505: variable 'dhcp_interface2' from source: play vars 12033 1726867175.90572: variable 'dhcp_interface2' from source: play vars 12033 1726867175.90587: variable 'controller_profile' from source: play vars 12033 1726867175.90657: variable 'controller_profile' from source: play vars 12033 1726867175.90892: variable 'omit' from source: magic vars 12033 1726867175.90908: variable '__lsr_ansible_managed' from source: task vars 12033 1726867175.90980: variable '__lsr_ansible_managed' from source: task vars 12033 1726867175.91162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12033 1726867175.92339: Loaded config def from plugin (lookup/template) 12033 1726867175.92343: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12033 1726867175.92360: File lookup term: get_ansible_managed.j2 12033 1726867175.92369: variable 'ansible_search_path' from source: unknown 12033 1726867175.92382: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12033 1726867175.92403: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12033 1726867175.92427: variable 'ansible_search_path' from source: unknown 12033 1726867175.98529: variable 'ansible_managed' from source: unknown 12033 1726867175.98653: variable 'omit' from source: magic vars 12033 1726867175.98684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867175.98713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867175.98737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867175.98782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867175.98785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867175.98802: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867175.98809: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.98815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.98908: Set connection var ansible_pipelining to False 12033 1726867175.98951: Set connection var ansible_shell_executable to /bin/sh 12033 1726867175.98953: Set connection var ansible_timeout to 10 12033 1726867175.98955: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867175.98964: Set connection 
var ansible_connection to ssh 12033 1726867175.98966: Set connection var ansible_shell_type to sh 12033 1726867175.98983: variable 'ansible_shell_executable' from source: unknown 12033 1726867175.98992: variable 'ansible_connection' from source: unknown 12033 1726867175.99182: variable 'ansible_module_compression' from source: unknown 12033 1726867175.99185: variable 'ansible_shell_type' from source: unknown 12033 1726867175.99188: variable 'ansible_shell_executable' from source: unknown 12033 1726867175.99193: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867175.99195: variable 'ansible_pipelining' from source: unknown 12033 1726867175.99197: variable 'ansible_timeout' from source: unknown 12033 1726867175.99199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867175.99202: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867175.99204: variable 'omit' from source: magic vars 12033 1726867175.99206: starting attempt loop 12033 1726867175.99208: running the handler 12033 1726867175.99210: _low_level_execute_command(): starting 12033 1726867175.99212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867175.99982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867176.00028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.00081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867176.01780: stdout chunk (state=3): >>>/root <<< 12033 1726867176.01939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867176.01944: stdout chunk (state=3): >>><<< 12033 1726867176.01946: stderr chunk (state=3): >>><<< 12033 1726867176.01971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867176.02069: _low_level_execute_command(): starting 12033 1726867176.02074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811 `" && echo ansible-tmp-1726867176.0197904-12714-211204179339811="` echo /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811 `" ) && sleep 0' 12033 1726867176.02660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867176.02672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867176.02686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867176.02705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867176.02728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867176.02822: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867176.02854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.02925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867176.04818: stdout chunk (state=3): >>>ansible-tmp-1726867176.0197904-12714-211204179339811=/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811 <<< 12033 1726867176.04932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867176.04998: stderr chunk (state=3): >>><<< 12033 1726867176.05028: stdout chunk (state=3): >>><<< 12033 1726867176.05130: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867176.0197904-12714-211204179339811=/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 12033 1726867176.05238: variable 'ansible_module_compression' from source: unknown 12033 1726867176.05492: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 12033 1726867176.05496: ANSIBALLZ: Acquiring lock 12033 1726867176.05498: ANSIBALLZ: Lock acquired: 139897898374288 12033 1726867176.05501: ANSIBALLZ: Creating module 12033 1726867176.44695: ANSIBALLZ: Writing module into payload 12033 1726867176.45056: ANSIBALLZ: Writing module 12033 1726867176.45084: ANSIBALLZ: Renaming module 12033 1726867176.45094: ANSIBALLZ: Done creating module 12033 1726867176.45132: variable 'ansible_facts' from source: unknown 12033 1726867176.45307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py 12033 1726867176.45591: Sending initial data 12033 1726867176.45598: Sent initial data (168 bytes) 12033 1726867176.46585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867176.46669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867176.46689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867176.46715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.46802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867176.48468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867176.48511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867176.48564: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpo4bkavr5 /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py <<< 12033 1726867176.48568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py" <<< 12033 1726867176.48612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpo4bkavr5" to remote "/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py" <<< 12033 1726867176.49907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867176.49911: stdout chunk (state=3): >>><<< 12033 1726867176.49913: stderr chunk (state=3): >>><<< 12033 1726867176.49915: done transferring module to remote 12033 1726867176.49917: _low_level_execute_command(): starting 12033 1726867176.49920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/ /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py && sleep 0' 12033 1726867176.50526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867176.50529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867176.50541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867176.50610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867176.50646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867176.50674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.50720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867176.52606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867176.52609: stdout chunk (state=3): >>><<< 12033 1726867176.52611: stderr chunk (state=3): >>><<< 12033 1726867176.52614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867176.52616: _low_level_execute_command(): starting 12033 1726867176.52618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/AnsiballZ_network_connections.py && sleep 0' 12033 1726867176.53219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867176.53252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867176.53295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867176.53415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867176.53418: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867176.53421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.53480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867176.94082: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12033 1726867176.95958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867176.95991: stdout chunk (state=3): >>><<< 12033 1726867176.96003: stderr chunk (state=3): >>><<< 12033 1726867176.96188: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", 
"type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.15.68 closed. 12033 1726867176.96196: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867176.96291: _low_level_execute_command(): starting 12033 1726867176.96382: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867176.0197904-12714-211204179339811/ > /dev/null 2>&1 && sleep 0' 12033 1726867176.97888: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867176.98132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867176.98193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867176.98268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.00484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.00495: stdout chunk (state=3): >>><<< 12033 1726867177.00508: stderr chunk (state=3): >>><<< 12033 1726867177.00684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867177.00688: handler run complete 12033 1726867177.00720: attempt loop complete, returning result 12033 1726867177.00727: _execute() done 12033 1726867177.00733: dumping result to json 12033 1726867177.00750: done dumping result, returning 12033 1726867177.00763: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-74bb-502b-000000000287] 12033 1726867177.00772: sending task result for task 0affcac9-a3a5-74bb-502b-000000000287 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": 
"test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active) 12033 1726867177.01081: no more pending results, returning what we have 12033 1726867177.01085: results queue empty 12033 1726867177.01086: checking for any_errors_fatal 12033 1726867177.01098: done checking for any_errors_fatal 12033 1726867177.01099: checking for max_fail_percentage 12033 1726867177.01101: done checking for max_fail_percentage 12033 1726867177.01102: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.01103: done checking to see if all hosts have failed 12033 1726867177.01103: getting the remaining hosts for this loop 12033 1726867177.01105: done getting the remaining hosts for this loop 12033 1726867177.01109: getting the next task for host managed_node3 12033 1726867177.01117: done getting next task for host managed_node3 12033 1726867177.01120: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867177.01124: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.01135: getting variables 12033 1726867177.01137: in VariableManager get_vars() 12033 1726867177.01173: Calling all_inventory to load vars for managed_node3 12033 1726867177.01176: Calling groups_inventory to load vars for managed_node3 12033 1726867177.01383: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.01397: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.01400: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.01404: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.01926: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000287 12033 1726867177.01930: WORKER PROCESS EXITING 12033 1726867177.04204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.07362: done with get_vars() 12033 1726867177.07595: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:19:37 -0400 (0:00:01.242) 0:00:16.192 ****** 12033 1726867177.07686: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867177.07688: Creating lock for fedora.linux_system_roles.network_state 12033 1726867177.08612: worker is 1 (out of 1 available) 12033 1726867177.08621: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867177.08632: done queuing things up, now waiting for results queue to drain 12033 1726867177.08633: waiting for pending results... 
12033 1726867177.09003: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867177.09115: in run() - task 0affcac9-a3a5-74bb-502b-000000000288 12033 1726867177.09139: variable 'ansible_search_path' from source: unknown 12033 1726867177.09147: variable 'ansible_search_path' from source: unknown 12033 1726867177.09240: calling self._execute() 12033 1726867177.09667: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.09671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.09674: variable 'omit' from source: magic vars 12033 1726867177.10181: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.10307: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.10549: variable 'network_state' from source: role '' defaults 12033 1726867177.10565: Evaluated conditional (network_state != {}): False 12033 1726867177.10572: when evaluation is False, skipping this task 12033 1726867177.10583: _execute() done 12033 1726867177.10589: dumping result to json 12033 1726867177.10596: done dumping result, returning 12033 1726867177.10606: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-74bb-502b-000000000288] 12033 1726867177.10615: sending task result for task 0affcac9-a3a5-74bb-502b-000000000288 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867177.10827: no more pending results, returning what we have 12033 1726867177.10831: results queue empty 12033 1726867177.10832: checking for any_errors_fatal 12033 1726867177.10847: done checking for any_errors_fatal 12033 1726867177.10848: checking for max_fail_percentage 12033 1726867177.10849: done checking for max_fail_percentage 12033 1726867177.10850: 
checking to see if all hosts have failed and the running result is not ok 12033 1726867177.10850: done checking to see if all hosts have failed 12033 1726867177.10851: getting the remaining hosts for this loop 12033 1726867177.10853: done getting the remaining hosts for this loop 12033 1726867177.10856: getting the next task for host managed_node3 12033 1726867177.10864: done getting next task for host managed_node3 12033 1726867177.10867: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867177.10872: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.10898: getting variables 12033 1726867177.10900: in VariableManager get_vars() 12033 1726867177.10937: Calling all_inventory to load vars for managed_node3 12033 1726867177.10940: Calling groups_inventory to load vars for managed_node3 12033 1726867177.10942: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.10955: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.10958: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.10961: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.12048: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000288 12033 1726867177.12052: WORKER PROCESS EXITING 12033 1726867177.14871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.19039: done with get_vars() 12033 1726867177.19069: done getting variables 12033 1726867177.19357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:19:37 -0400 (0:00:00.117) 0:00:16.309 ****** 12033 1726867177.19397: entering _queue_task() for managed_node3/debug 12033 1726867177.20195: worker is 1 (out of 1 available) 12033 1726867177.20206: exiting _queue_task() for managed_node3/debug 12033 1726867177.20217: done queuing things up, now waiting for results queue to drain 12033 1726867177.20218: waiting for pending results... 
12033 1726867177.20653: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867177.21383: in run() - task 0affcac9-a3a5-74bb-502b-000000000289 12033 1726867177.21406: variable 'ansible_search_path' from source: unknown 12033 1726867177.21410: variable 'ansible_search_path' from source: unknown 12033 1726867177.21446: calling self._execute() 12033 1726867177.21543: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.21548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.21559: variable 'omit' from source: magic vars 12033 1726867177.21947: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.21957: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.21963: variable 'omit' from source: magic vars 12033 1726867177.22027: variable 'omit' from source: magic vars 12033 1726867177.22069: variable 'omit' from source: magic vars 12033 1726867177.22116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867177.22154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867177.22182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867177.22200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.22283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.22286: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867177.22289: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.22291: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867177.22359: Set connection var ansible_pipelining to False 12033 1726867177.22368: Set connection var ansible_shell_executable to /bin/sh 12033 1726867177.22387: Set connection var ansible_timeout to 10 12033 1726867177.22395: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867177.22398: Set connection var ansible_connection to ssh 12033 1726867177.22403: Set connection var ansible_shell_type to sh 12033 1726867177.22425: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.22428: variable 'ansible_connection' from source: unknown 12033 1726867177.22431: variable 'ansible_module_compression' from source: unknown 12033 1726867177.22433: variable 'ansible_shell_type' from source: unknown 12033 1726867177.22439: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.22441: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.22444: variable 'ansible_pipelining' from source: unknown 12033 1726867177.22446: variable 'ansible_timeout' from source: unknown 12033 1726867177.22448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.22602: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867177.22636: variable 'omit' from source: magic vars 12033 1726867177.22639: starting attempt loop 12033 1726867177.22642: running the handler 12033 1726867177.22782: variable '__network_connections_result' from source: set_fact 12033 1726867177.22849: handler run complete 12033 1726867177.22869: attempt loop complete, returning result 12033 1726867177.22873: _execute() done 12033 1726867177.22880: dumping result to json 12033 1726867177.22883: 
done dumping result, returning 12033 1726867177.22984: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-74bb-502b-000000000289] 12033 1726867177.22988: sending task result for task 0affcac9-a3a5-74bb-502b-000000000289 12033 1726867177.23206: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000289 12033 1726867177.23210: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)" ] } 12033 1726867177.23309: no more pending results, returning what we have 12033 1726867177.23314: results queue empty 12033 1726867177.23315: checking for any_errors_fatal 12033 1726867177.23323: done checking for any_errors_fatal 12033 1726867177.23323: checking for max_fail_percentage 12033 1726867177.23326: done checking for max_fail_percentage 12033 1726867177.23326: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.23327: done checking to see if all hosts have failed 12033 1726867177.23328: getting the remaining hosts for this loop 12033 1726867177.23330: done getting the remaining hosts for this loop 12033 1726867177.23334: getting the next task for host 
managed_node3 12033 1726867177.23343: done getting next task for host managed_node3 12033 1726867177.23347: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867177.23352: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.23364: getting variables 12033 1726867177.23366: in VariableManager get_vars() 12033 1726867177.23403: Calling all_inventory to load vars for managed_node3 12033 1726867177.23406: Calling groups_inventory to load vars for managed_node3 12033 1726867177.23408: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.23419: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.23428: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.23431: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.25436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.29103: done with get_vars() 12033 1726867177.29126: done getting variables 12033 1726867177.29450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:19:37 -0400 (0:00:00.100) 0:00:16.410 ****** 12033 1726867177.29491: entering _queue_task() for managed_node3/debug 12033 1726867177.29999: worker is 1 (out of 1 available) 12033 1726867177.30010: exiting _queue_task() for managed_node3/debug 12033 1726867177.30021: done queuing things up, now waiting for results queue to drain 12033 1726867177.30023: waiting for pending results... 
12033 1726867177.30399: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867177.30405: in run() - task 0affcac9-a3a5-74bb-502b-00000000028a 12033 1726867177.30409: variable 'ansible_search_path' from source: unknown 12033 1726867177.30412: variable 'ansible_search_path' from source: unknown 12033 1726867177.30415: calling self._execute() 12033 1726867177.30499: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.30504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.30514: variable 'omit' from source: magic vars 12033 1726867177.30941: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.30966: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.31003: variable 'omit' from source: magic vars 12033 1726867177.31095: variable 'omit' from source: magic vars 12033 1726867177.31245: variable 'omit' from source: magic vars 12033 1726867177.31286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867177.31434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867177.31451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867177.31470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.31482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.31507: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867177.31511: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.31513: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867177.31723: Set connection var ansible_pipelining to False 12033 1726867177.31731: Set connection var ansible_shell_executable to /bin/sh 12033 1726867177.31738: Set connection var ansible_timeout to 10 12033 1726867177.31743: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867177.31846: Set connection var ansible_connection to ssh 12033 1726867177.31853: Set connection var ansible_shell_type to sh 12033 1726867177.31904: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.31907: variable 'ansible_connection' from source: unknown 12033 1726867177.31911: variable 'ansible_module_compression' from source: unknown 12033 1726867177.31913: variable 'ansible_shell_type' from source: unknown 12033 1726867177.31915: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.31917: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.31919: variable 'ansible_pipelining' from source: unknown 12033 1726867177.31921: variable 'ansible_timeout' from source: unknown 12033 1726867177.31923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.32184: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867177.32191: variable 'omit' from source: magic vars 12033 1726867177.32195: starting attempt loop 12033 1726867177.32198: running the handler 12033 1726867177.32256: variable '__network_connections_result' from source: set_fact 12033 1726867177.32684: variable '__network_connections_result' from source: set_fact 12033 1726867177.32687: handler run complete 12033 1726867177.32692: attempt loop complete, returning result 12033 1726867177.32695: 
_execute() done 12033 1726867177.32697: dumping result to json 12033 1726867177.32699: done dumping result, returning 12033 1726867177.32702: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-74bb-502b-00000000028a] 12033 1726867177.32704: sending task result for task 0affcac9-a3a5-74bb-502b-00000000028a 12033 1726867177.32773: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000028a 12033 1726867177.32779: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "ad_actor_sys_prio": 65535,
                            "ad_actor_system": "00:00:5e:00:53:5d",
                            "ad_select": "stable",
                            "ad_user_port_key": 1023,
                            "all_ports_active": true,
                            "downdelay": 0,
                            "lacp_rate": "slow",
                            "lp_interval": 128,
                            "miimon": 110,
                            "min_links": 0,
                            "mode": "802.3ad",
                            "num_grat_arp": 64,
                            "primary_reselect": "better",
                            "resend_igmp": 225,
                            "updelay": 0,
                            "use_carrier": true,
                            "xmit_hash_policy": "encap2+3"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)"
        ]
    }
}
12033 1726867177.32901: no more pending results, returning what we have 12033 1726867177.32904: results queue empty 12033 1726867177.32906: checking for any_errors_fatal 12033 1726867177.32910: done checking for any_errors_fatal 12033 1726867177.32911: checking for max_fail_percentage 12033 1726867177.32914: done checking for max_fail_percentage 12033 1726867177.32914: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.32915: done checking to see if all hosts have failed 12033 1726867177.32916: getting the remaining hosts for this loop 12033 1726867177.32917: done getting the remaining hosts for this loop 12033 1726867177.32921: getting the next task for host managed_node3 12033 1726867177.32928: done getting next task for host managed_node3 12033 1726867177.32931: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages
for the network_state 12033 1726867177.32936: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.32947: getting variables 12033 1726867177.32949: in VariableManager get_vars() 12033 1726867177.33092: Calling all_inventory to load vars for managed_node3 12033 1726867177.33096: Calling groups_inventory to load vars for managed_node3 12033 1726867177.33099: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.33109: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.33112: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.33115: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.34522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.36089: done with get_vars() 12033 1726867177.36111: done getting variables 12033 1726867177.36213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:19:37 -0400 (0:00:00.067) 0:00:16.478 ****** 12033 1726867177.36252: entering _queue_task() for managed_node3/debug 12033 1726867177.36674: worker is 1 (out of 1 available) 12033 1726867177.36852: exiting _queue_task() for managed_node3/debug 12033 1726867177.36863: done queuing things up, now waiting for results queue to drain 12033 1726867177.36865: waiting for pending results... 
12033 1726867177.37064: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867177.37214: in run() - task 0affcac9-a3a5-74bb-502b-00000000028b 12033 1726867177.37218: variable 'ansible_search_path' from source: unknown 12033 1726867177.37222: variable 'ansible_search_path' from source: unknown 12033 1726867177.37225: calling self._execute() 12033 1726867177.37321: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.37325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.37327: variable 'omit' from source: magic vars 12033 1726867177.37716: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.37821: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.37932: variable 'network_state' from source: role '' defaults 12033 1726867177.37936: Evaluated conditional (network_state != {}): False 12033 1726867177.37938: when evaluation is False, skipping this task 12033 1726867177.37940: _execute() done 12033 1726867177.37943: dumping result to json 12033 1726867177.37945: done dumping result, returning 12033 1726867177.37947: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-74bb-502b-00000000028b] 12033 1726867177.37949: sending task result for task 0affcac9-a3a5-74bb-502b-00000000028b 12033 1726867177.38103: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000028b 12033 1726867177.38107: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
12033 1726867177.38193: no more pending results, returning what we have 12033 1726867177.38197: results queue empty 12033 1726867177.38198: checking for any_errors_fatal 12033 1726867177.38209: done checking for any_errors_fatal 12033 1726867177.38209: checking for
max_fail_percentage 12033 1726867177.38212: done checking for max_fail_percentage 12033 1726867177.38213: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.38213: done checking to see if all hosts have failed 12033 1726867177.38214: getting the remaining hosts for this loop 12033 1726867177.38216: done getting the remaining hosts for this loop 12033 1726867177.38219: getting the next task for host managed_node3 12033 1726867177.38227: done getting next task for host managed_node3 12033 1726867177.38230: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867177.38235: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.38251: getting variables 12033 1726867177.38253: in VariableManager get_vars() 12033 1726867177.38288: Calling all_inventory to load vars for managed_node3 12033 1726867177.38292: Calling groups_inventory to load vars for managed_node3 12033 1726867177.38295: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.38305: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.38308: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.38311: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.39813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.41331: done with get_vars() 12033 1726867177.41350: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:19:37 -0400 (0:00:00.051) 0:00:16.530 ****** 12033 1726867177.41445: entering _queue_task() for managed_node3/ping 12033 1726867177.41447: Creating lock for ping 12033 1726867177.41730: worker is 1 (out of 1 available) 12033 1726867177.41740: exiting _queue_task() for managed_node3/ping 12033 1726867177.41752: done queuing things up, now waiting for results queue to drain 12033 1726867177.41753: waiting for pending results... 
12033 1726867177.42039: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867177.42199: in run() - task 0affcac9-a3a5-74bb-502b-00000000028c 12033 1726867177.42256: variable 'ansible_search_path' from source: unknown 12033 1726867177.42259: variable 'ansible_search_path' from source: unknown 12033 1726867177.42288: calling self._execute() 12033 1726867177.42400: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.42583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.42586: variable 'omit' from source: magic vars 12033 1726867177.42826: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.42846: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.42858: variable 'omit' from source: magic vars 12033 1726867177.42929: variable 'omit' from source: magic vars 12033 1726867177.42971: variable 'omit' from source: magic vars 12033 1726867177.43028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867177.43068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867177.43099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867177.43123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.43139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.43175: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867177.43187: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.43199: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12033 1726867177.43303: Set connection var ansible_pipelining to False 12033 1726867177.43318: Set connection var ansible_shell_executable to /bin/sh 12033 1726867177.43330: Set connection var ansible_timeout to 10 12033 1726867177.43339: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867177.43345: Set connection var ansible_connection to ssh 12033 1726867177.43354: Set connection var ansible_shell_type to sh 12033 1726867177.43583: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.43586: variable 'ansible_connection' from source: unknown 12033 1726867177.43591: variable 'ansible_module_compression' from source: unknown 12033 1726867177.43594: variable 'ansible_shell_type' from source: unknown 12033 1726867177.43600: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.43603: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.43605: variable 'ansible_pipelining' from source: unknown 12033 1726867177.43607: variable 'ansible_timeout' from source: unknown 12033 1726867177.43609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.43633: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867177.43664: variable 'omit' from source: magic vars 12033 1726867177.43673: starting attempt loop 12033 1726867177.43682: running the handler 12033 1726867177.43704: _low_level_execute_command(): starting 12033 1726867177.43717: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867177.44414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867177.44425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 
1726867177.44441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867177.44494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867177.44557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867177.44614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.44655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.46413: stdout chunk (state=3): >>>/root <<< 12033 1726867177.46511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.46522: stdout chunk (state=3): >>><<< 12033 1726867177.46535: stderr chunk (state=3): >>><<< 12033 1726867177.46619: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867177.46639: _low_level_execute_command(): starting 12033 1726867177.46655: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064 `" && echo ansible-tmp-1726867177.466254-12805-30461427903064="` echo /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064 `" ) && sleep 0' 12033 1726867177.47538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867177.47553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867177.47568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867177.47626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867177.47707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867177.47736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867177.47767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.47852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.49973: stdout chunk (state=3): >>>ansible-tmp-1726867177.466254-12805-30461427903064=/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064 <<< 12033 1726867177.49980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.49983: stdout chunk (state=3): >>><<< 12033 1726867177.49985: stderr chunk (state=3): >>><<< 12033 1726867177.50105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867177.466254-12805-30461427903064=/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867177.50109: variable 'ansible_module_compression' from source: unknown 12033 1726867177.50111: ANSIBALLZ: Using lock for ping 12033 1726867177.50114: ANSIBALLZ: Acquiring lock 12033 1726867177.50121: ANSIBALLZ: Lock acquired: 139897896744384 12033 1726867177.50183: ANSIBALLZ: Creating module 12033 1726867177.62318: ANSIBALLZ: Writing module into payload 12033 1726867177.62387: ANSIBALLZ: Writing module 12033 1726867177.62421: ANSIBALLZ: Renaming module 12033 1726867177.62441: ANSIBALLZ: Done creating module 12033 1726867177.62462: variable 'ansible_facts' from source: unknown 12033 1726867177.62549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py 12033 1726867177.62775: Sending initial data 12033 1726867177.62782: Sent initial data (151 bytes) 12033 1726867177.63349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867177.63359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867177.63370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867177.63387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867177.63482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 
12033 1726867177.63486: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867177.63491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867177.63527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867177.63552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867177.63556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.63608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.65241: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867177.65298: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867177.65471: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp5xqipdgu /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py <<< 12033 1726867177.65478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py" <<< 12033 1726867177.65520: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp5xqipdgu" to remote "/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py" <<< 12033 1726867177.66384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.66501: stderr chunk (state=3): >>><<< 12033 1726867177.66611: stdout chunk (state=3): >>><<< 12033 1726867177.66614: done transferring module to remote 12033 1726867177.66616: _low_level_execute_command(): starting 12033 1726867177.66619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/ /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py && sleep 0' 12033 1726867177.67262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867177.67270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.67337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.69239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.69242: stdout chunk (state=3): >>><<< 12033 1726867177.69244: stderr chunk (state=3): >>><<< 12033 1726867177.69246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867177.69248: _low_level_execute_command(): starting 12033 1726867177.69250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/AnsiballZ_ping.py && sleep 0' 12033 1726867177.69748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867177.69760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867177.69772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867177.69799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867177.69816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867177.69901: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867177.69926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.70001: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12033 1726867177.85104: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12033 1726867177.86353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867177.86372: stdout chunk (state=3): >>><<< 12033 1726867177.86388: stderr chunk (state=3): >>><<< 12033 1726867177.86414: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867177.86443: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867177.86459: _low_level_execute_command(): starting 12033 1726867177.86481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867177.466254-12805-30461427903064/ > /dev/null 2>&1 && sleep 0' 12033 1726867177.87202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867177.87237: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867177.87253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867177.87272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867177.87426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867177.89292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867177.89307: stdout chunk (state=3): >>><<< 12033 1726867177.89316: stderr chunk (state=3): >>><<< 12033 1726867177.89332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867177.89345: handler run complete 12033 1726867177.89361: attempt loop complete, returning result 12033 1726867177.89367: _execute() done 12033 
1726867177.89372: dumping result to json 12033 1726867177.89379: done dumping result, returning 12033 1726867177.89392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-74bb-502b-00000000028c] 12033 1726867177.89402: sending task result for task 0affcac9-a3a5-74bb-502b-00000000028c ok: [managed_node3] => { "changed": false, "ping": "pong" } 12033 1726867177.89640: no more pending results, returning what we have 12033 1726867177.89644: results queue empty 12033 1726867177.89645: checking for any_errors_fatal 12033 1726867177.89650: done checking for any_errors_fatal 12033 1726867177.89651: checking for max_fail_percentage 12033 1726867177.89653: done checking for max_fail_percentage 12033 1726867177.89654: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.89655: done checking to see if all hosts have failed 12033 1726867177.89655: getting the remaining hosts for this loop 12033 1726867177.89657: done getting the remaining hosts for this loop 12033 1726867177.89660: getting the next task for host managed_node3 12033 1726867177.89672: done getting next task for host managed_node3 12033 1726867177.89674: ^ task is: TASK: meta (role_complete) 12033 1726867177.89886: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867177.89902: getting variables 12033 1726867177.89904: in VariableManager get_vars() 12033 1726867177.89941: Calling all_inventory to load vars for managed_node3 12033 1726867177.89944: Calling groups_inventory to load vars for managed_node3 12033 1726867177.89947: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.89957: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.89960: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.89963: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.90492: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000028c 12033 1726867177.90496: WORKER PROCESS EXITING 12033 1726867177.91560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.93200: done with get_vars() 12033 1726867177.93222: done getting variables 12033 1726867177.93304: done queuing things up, now waiting for results queue to drain 12033 1726867177.93313: results queue empty 12033 1726867177.93315: checking for any_errors_fatal 12033 1726867177.93317: done checking for any_errors_fatal 12033 1726867177.93318: checking for max_fail_percentage 12033 1726867177.93319: done checking for max_fail_percentage 12033 1726867177.93320: checking to see if all hosts have failed and the running result 
is not ok 12033 1726867177.93321: done checking to see if all hosts have failed 12033 1726867177.93322: getting the remaining hosts for this loop 12033 1726867177.93323: done getting the remaining hosts for this loop 12033 1726867177.93325: getting the next task for host managed_node3 12033 1726867177.93330: done getting next task for host managed_node3 12033 1726867177.93332: ^ task is: TASK: Show result 12033 1726867177.93335: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867177.93337: getting variables 12033 1726867177.93339: in VariableManager get_vars() 12033 1726867177.93349: Calling all_inventory to load vars for managed_node3 12033 1726867177.93351: Calling groups_inventory to load vars for managed_node3 12033 1726867177.93353: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.93358: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.93360: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.93363: Calling groups_plugins_play to load vars for managed_node3 12033 1726867177.94453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867177.95914: done with get_vars() 12033 1726867177.95933: done getting variables 12033 1726867177.95975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Friday 20 September 2024 17:19:37 -0400 (0:00:00.545) 0:00:17.075 ****** 12033 1726867177.96006: entering _queue_task() for managed_node3/debug 12033 1726867177.96312: worker is 1 (out of 1 available) 12033 1726867177.96323: exiting _queue_task() for managed_node3/debug 12033 1726867177.96334: done queuing things up, now waiting for results queue to drain 12033 1726867177.96335: waiting for pending results... 
12033 1726867177.96608: running TaskExecutor() for managed_node3/TASK: Show result 12033 1726867177.96703: in run() - task 0affcac9-a3a5-74bb-502b-0000000001c6 12033 1726867177.96718: variable 'ansible_search_path' from source: unknown 12033 1726867177.96722: variable 'ansible_search_path' from source: unknown 12033 1726867177.96757: calling self._execute() 12033 1726867177.96849: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.96855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.96865: variable 'omit' from source: magic vars 12033 1726867177.97246: variable 'ansible_distribution_major_version' from source: facts 12033 1726867177.97258: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867177.97266: variable 'omit' from source: magic vars 12033 1726867177.97312: variable 'omit' from source: magic vars 12033 1726867177.97353: variable 'omit' from source: magic vars 12033 1726867177.97395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867177.97432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867177.97450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867177.97475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.97494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867177.97524: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867177.97527: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.97529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.97632: Set 
connection var ansible_pipelining to False 12033 1726867177.97641: Set connection var ansible_shell_executable to /bin/sh 12033 1726867177.97649: Set connection var ansible_timeout to 10 12033 1726867177.97654: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867177.97657: Set connection var ansible_connection to ssh 12033 1726867177.97662: Set connection var ansible_shell_type to sh 12033 1726867177.97693: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.97697: variable 'ansible_connection' from source: unknown 12033 1726867177.97700: variable 'ansible_module_compression' from source: unknown 12033 1726867177.97706: variable 'ansible_shell_type' from source: unknown 12033 1726867177.97709: variable 'ansible_shell_executable' from source: unknown 12033 1726867177.97711: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867177.97713: variable 'ansible_pipelining' from source: unknown 12033 1726867177.97715: variable 'ansible_timeout' from source: unknown 12033 1726867177.97717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867177.97905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867177.97909: variable 'omit' from source: magic vars 12033 1726867177.97913: starting attempt loop 12033 1726867177.97915: running the handler 12033 1726867177.97925: variable '__network_connections_result' from source: set_fact 12033 1726867177.98004: variable '__network_connections_result' from source: set_fact 12033 1726867177.98228: handler run complete 12033 1726867177.98261: attempt loop complete, returning result 12033 1726867177.98264: _execute() done 12033 1726867177.98267: dumping result to json 12033 
1726867177.98359: done dumping result, returning 12033 1726867177.98362: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-74bb-502b-0000000001c6] 12033 1726867177.98365: sending task result for task 0affcac9-a3a5-74bb-502b-0000000001c6 12033 1726867177.98434: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000001c6 12033 1726867177.98438: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up 
connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, de3a6889-a8bc-4195-81cc-ec6220008b92 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014cff9e-2499-4f69-88d3-e2ba3869747a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5369e4ef-1a37-4dbb-886a-05f2f96cb3c2 (not-active)" ] } } 12033 1726867177.98551: no more pending results, returning what we have 12033 1726867177.98556: results queue empty 12033 1726867177.98557: checking for any_errors_fatal 12033 1726867177.98559: done checking for any_errors_fatal 12033 1726867177.98560: checking for max_fail_percentage 12033 1726867177.98562: done checking for max_fail_percentage 12033 1726867177.98563: checking to see if all hosts have failed and the running result is not ok 12033 1726867177.98563: done checking to see if all hosts have failed 12033 1726867177.98564: getting the remaining hosts for this loop 12033 1726867177.98566: done getting the remaining hosts for this loop 12033 1726867177.98570: getting the next task for host managed_node3 12033 1726867177.98580: done getting next task for host managed_node3 12033 1726867177.98584: ^ task is: TASK: Asserts 12033 1726867177.98588: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867177.98593: getting variables 12033 1726867177.98595: in VariableManager get_vars() 12033 1726867177.98625: Calling all_inventory to load vars for managed_node3 12033 1726867177.98628: Calling groups_inventory to load vars for managed_node3 12033 1726867177.98632: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867177.98644: Calling all_plugins_play to load vars for managed_node3 12033 1726867177.98647: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867177.98649: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.00185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.01633: done with get_vars() 12033 1726867178.01653: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:19:38 -0400 (0:00:00.057) 0:00:17.133 ****** 12033 1726867178.01745: entering _queue_task() for managed_node3/include_tasks 12033 1726867178.02009: worker is 1 (out of 1 available) 12033 1726867178.02020: exiting _queue_task() for managed_node3/include_tasks 12033 1726867178.02031: done queuing things up, now waiting for results queue to drain 12033 1726867178.02032: waiting for pending results... 
12033 1726867178.02397: running TaskExecutor() for managed_node3/TASK: Asserts 12033 1726867178.02407: in run() - task 0affcac9-a3a5-74bb-502b-00000000008d 12033 1726867178.02414: variable 'ansible_search_path' from source: unknown 12033 1726867178.02418: variable 'ansible_search_path' from source: unknown 12033 1726867178.02495: variable 'lsr_assert' from source: include params 12033 1726867178.02666: variable 'lsr_assert' from source: include params 12033 1726867178.02737: variable 'omit' from source: magic vars 12033 1726867178.02976: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.02983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.02985: variable 'omit' from source: magic vars 12033 1726867178.03200: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.03215: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.03226: variable 'item' from source: unknown 12033 1726867178.03307: variable 'item' from source: unknown 12033 1726867178.03344: variable 'item' from source: unknown 12033 1726867178.03419: variable 'item' from source: unknown 12033 1726867178.03684: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.03692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.03696: variable 'omit' from source: magic vars 12033 1726867178.03826: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.03838: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.03848: variable 'item' from source: unknown 12033 1726867178.03924: variable 'item' from source: unknown 12033 1726867178.03958: variable 'item' from source: unknown 12033 1726867178.04032: variable 'item' from source: unknown 12033 1726867178.04249: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867178.04253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.04256: variable 'omit' from source: magic vars 12033 1726867178.04372: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.04375: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.04450: variable 'item' from source: unknown 12033 1726867178.04454: variable 'item' from source: unknown 12033 1726867178.04496: variable 'item' from source: unknown 12033 1726867178.04564: variable 'item' from source: unknown 12033 1726867178.04668: dumping result to json 12033 1726867178.04672: done dumping result, returning 12033 1726867178.04675: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-74bb-502b-00000000008d] 12033 1726867178.04699: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008d 12033 1726867178.04833: no more pending results, returning what we have 12033 1726867178.04838: in VariableManager get_vars() 12033 1726867178.04872: Calling all_inventory to load vars for managed_node3 12033 1726867178.04875: Calling groups_inventory to load vars for managed_node3 12033 1726867178.04880: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.04895: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.04898: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.04901: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.05420: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008d 12033 1726867178.05427: WORKER PROCESS EXITING 12033 1726867178.06303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.08086: done with get_vars() 12033 1726867178.08106: variable 'ansible_search_path' from source: unknown 12033 1726867178.08108: variable 'ansible_search_path' from source: unknown 12033 
1726867178.08149: variable 'ansible_search_path' from source: unknown 12033 1726867178.08150: variable 'ansible_search_path' from source: unknown 12033 1726867178.08182: variable 'ansible_search_path' from source: unknown 12033 1726867178.08183: variable 'ansible_search_path' from source: unknown 12033 1726867178.08211: we have included files to process 12033 1726867178.08212: generating all_blocks data 12033 1726867178.08214: done generating all_blocks data 12033 1726867178.08219: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12033 1726867178.08220: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12033 1726867178.08222: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 12033 1726867178.08372: in VariableManager get_vars() 12033 1726867178.08394: done with get_vars() 12033 1726867178.08400: variable 'item' from source: include params 12033 1726867178.08504: variable 'item' from source: include params 12033 1726867178.08535: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12033 1726867178.08618: in VariableManager get_vars() 12033 1726867178.08637: done with get_vars() 12033 1726867178.08761: done processing included file 12033 1726867178.08763: iterating over new_blocks loaded from include file 12033 1726867178.08764: in VariableManager get_vars() 12033 1726867178.08779: done with get_vars() 12033 1726867178.08781: filtering new block on tags 12033 1726867178.08827: done filtering new block on tags 12033 
1726867178.08830: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed_node3 => (item=tasks/assert_controller_device_present.yml) 12033 1726867178.08836: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12033 1726867178.08837: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12033 1726867178.08840: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 12033 1726867178.08961: in VariableManager get_vars() 12033 1726867178.08980: done with get_vars() 12033 1726867178.08992: done processing included file 12033 1726867178.08994: iterating over new_blocks loaded from include file 12033 1726867178.08995: in VariableManager get_vars() 12033 1726867178.09007: done with get_vars() 12033 1726867178.09009: filtering new block on tags 12033 1726867178.09030: done filtering new block on tags 12033 1726867178.09032: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed_node3 => (item=tasks/assert_bond_port_profile_present.yml) 12033 1726867178.09035: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867178.09036: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867178.09043: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867178.09345: in VariableManager get_vars() 12033 1726867178.09362: done with get_vars() 12033 1726867178.09404: in VariableManager get_vars() 12033 1726867178.09420: done with get_vars() 12033 1726867178.09432: done processing included file 12033 1726867178.09433: iterating over new_blocks loaded from include file 12033 1726867178.09434: in VariableManager get_vars() 12033 1726867178.09446: done with get_vars() 12033 1726867178.09447: filtering new block on tags 12033 1726867178.09490: done filtering new block on tags 12033 1726867178.09492: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 12033 1726867178.09495: extending task lists for all hosts with included blocks 12033 1726867178.10880: done extending task lists 12033 1726867178.10881: done processing included files 12033 1726867178.10882: results queue empty 12033 1726867178.10883: checking for any_errors_fatal 12033 1726867178.10889: done checking for any_errors_fatal 12033 1726867178.10890: checking for max_fail_percentage 12033 1726867178.10891: done checking for max_fail_percentage 12033 1726867178.10892: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.10893: done checking to see if all hosts have failed 12033 1726867178.10893: getting the remaining hosts for this loop 12033 1726867178.10895: done getting the remaining hosts for this loop 12033 1726867178.10897: getting the next task for host managed_node3 12033 1726867178.10902: done getting next task for host managed_node3 12033 1726867178.10904: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12033 1726867178.10907: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867178.10909: getting variables 12033 1726867178.10910: in VariableManager get_vars() 12033 1726867178.10919: Calling all_inventory to load vars for managed_node3 12033 1726867178.10921: Calling groups_inventory to load vars for managed_node3 12033 1726867178.10923: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.10929: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.10931: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.10934: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.11990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.13457: done with get_vars() 12033 1726867178.13481: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:38 -0400 (0:00:00.118) 0:00:17.251 ****** 12033 1726867178.13558: entering _queue_task() for managed_node3/include_tasks 12033 1726867178.13886: worker is 1 (out of 1 available) 12033 1726867178.13898: exiting _queue_task() for managed_node3/include_tasks 12033 1726867178.13910: done queuing things up, now waiting for results queue to drain 12033 1726867178.13912: waiting for pending results... 
12033 1726867178.14210: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12033 1726867178.14409: in run() - task 0affcac9-a3a5-74bb-502b-0000000003f5 12033 1726867178.14413: variable 'ansible_search_path' from source: unknown 12033 1726867178.14416: variable 'ansible_search_path' from source: unknown 12033 1726867178.14419: calling self._execute() 12033 1726867178.14465: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.14471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.14481: variable 'omit' from source: magic vars 12033 1726867178.14853: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.14865: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.14870: _execute() done 12033 1726867178.14874: dumping result to json 12033 1726867178.14876: done dumping result, returning 12033 1726867178.14950: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-74bb-502b-0000000003f5] 12033 1726867178.14954: sending task result for task 0affcac9-a3a5-74bb-502b-0000000003f5 12033 1726867178.15020: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000003f5 12033 1726867178.15022: WORKER PROCESS EXITING 12033 1726867178.15069: no more pending results, returning what we have 12033 1726867178.15075: in VariableManager get_vars() 12033 1726867178.15117: Calling all_inventory to load vars for managed_node3 12033 1726867178.15120: Calling groups_inventory to load vars for managed_node3 12033 1726867178.15124: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.15138: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.15141: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.15144: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867178.16634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.22881: done with get_vars() 12033 1726867178.22902: variable 'ansible_search_path' from source: unknown 12033 1726867178.22903: variable 'ansible_search_path' from source: unknown 12033 1726867178.22938: we have included files to process 12033 1726867178.22940: generating all_blocks data 12033 1726867178.22941: done generating all_blocks data 12033 1726867178.22942: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867178.22943: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867178.22945: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867178.23112: done processing included file 12033 1726867178.23114: iterating over new_blocks loaded from include file 12033 1726867178.23116: in VariableManager get_vars() 12033 1726867178.23132: done with get_vars() 12033 1726867178.23134: filtering new block on tags 12033 1726867178.23161: done filtering new block on tags 12033 1726867178.23164: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12033 1726867178.23168: extending task lists for all hosts with included blocks 12033 1726867178.23375: done extending task lists 12033 1726867178.23376: done processing included files 12033 1726867178.23379: results queue empty 12033 1726867178.23379: checking for any_errors_fatal 12033 1726867178.23382: done checking for any_errors_fatal 12033 1726867178.23383: checking for max_fail_percentage 12033 1726867178.23384: done checking for 
max_fail_percentage 12033 1726867178.23384: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.23385: done checking to see if all hosts have failed 12033 1726867178.23386: getting the remaining hosts for this loop 12033 1726867178.23387: done getting the remaining hosts for this loop 12033 1726867178.23390: getting the next task for host managed_node3 12033 1726867178.23394: done getting next task for host managed_node3 12033 1726867178.23396: ^ task is: TASK: Get stat for interface {{ interface }} 12033 1726867178.23399: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867178.23402: getting variables 12033 1726867178.23403: in VariableManager get_vars() 12033 1726867178.23412: Calling all_inventory to load vars for managed_node3 12033 1726867178.23414: Calling groups_inventory to load vars for managed_node3 12033 1726867178.23416: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.23422: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.23424: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.23427: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.24508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.26192: done with get_vars() 12033 1726867178.26212: done getting variables 12033 1726867178.26356: variable 'interface' from source: task vars 12033 1726867178.26360: variable 'controller_device' from source: play vars 12033 1726867178.26418: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:38 -0400 (0:00:00.128) 0:00:17.380 ****** 12033 1726867178.26449: entering _queue_task() for managed_node3/stat 12033 1726867178.26804: worker is 1 (out of 1 available) 12033 1726867178.26817: exiting _queue_task() for managed_node3/stat 12033 1726867178.26829: done queuing things up, now waiting for results queue to drain 12033 1726867178.26831: waiting for pending results... 
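The "Get stat for interface nm-bond" task queued here runs the `stat` module against the interface's sysfs node. The `module_args` recorded later in this log (`path: /sys/class/net/nm-bond`, `get_attributes/get_checksum/get_mime: false`) pin down the parameters; the templated path and `register` name below are assumptions in this hedged sketch:

```yaml
# Hypothetical sketch of get_interface_stat.yml:3, reconstructed from the
# module_args shown in this log ("interface" templates to "nm-bond" via
# controller_device); the register name is an assumption.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

Checking `/sys/class/net/<name>` is a standard way to test device presence: the kernel exposes one entry per network interface there, as the symlink in the later stat result confirms.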
12033 1726867178.27125: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 12033 1726867178.27257: in run() - task 0affcac9-a3a5-74bb-502b-0000000004af 12033 1726867178.27269: variable 'ansible_search_path' from source: unknown 12033 1726867178.27272: variable 'ansible_search_path' from source: unknown 12033 1726867178.27368: calling self._execute() 12033 1726867178.27433: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.27440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.27443: variable 'omit' from source: magic vars 12033 1726867178.27991: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.27995: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.27998: variable 'omit' from source: magic vars 12033 1726867178.28000: variable 'omit' from source: magic vars 12033 1726867178.28002: variable 'interface' from source: task vars 12033 1726867178.28005: variable 'controller_device' from source: play vars 12033 1726867178.28099: variable 'controller_device' from source: play vars 12033 1726867178.28102: variable 'omit' from source: magic vars 12033 1726867178.28105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867178.28207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867178.28211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867178.28214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.28216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.28219: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867178.28221: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.28223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.28326: Set connection var ansible_pipelining to False 12033 1726867178.28330: Set connection var ansible_shell_executable to /bin/sh 12033 1726867178.28343: Set connection var ansible_timeout to 10 12033 1726867178.28346: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867178.28349: Set connection var ansible_connection to ssh 12033 1726867178.28351: Set connection var ansible_shell_type to sh 12033 1726867178.28370: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.28373: variable 'ansible_connection' from source: unknown 12033 1726867178.28375: variable 'ansible_module_compression' from source: unknown 12033 1726867178.28380: variable 'ansible_shell_type' from source: unknown 12033 1726867178.28383: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.28385: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.28388: variable 'ansible_pipelining' from source: unknown 12033 1726867178.28394: variable 'ansible_timeout' from source: unknown 12033 1726867178.28402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.28595: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867178.28642: variable 'omit' from source: magic vars 12033 1726867178.28645: starting attempt loop 12033 1726867178.28648: running the handler 12033 1726867178.28650: _low_level_execute_command(): starting 12033 1726867178.28652: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867178.29408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867178.29458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867178.29473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.29485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867178.29564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.31257: stdout chunk (state=3): >>>/root <<< 12033 1726867178.31408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867178.31412: stdout chunk (state=3): >>><<< 12033 1726867178.31414: stderr chunk (state=3): >>><<< 12033 1726867178.31524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867178.31528: _low_level_execute_command(): starting 12033 1726867178.31531: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920 `" && echo ansible-tmp-1726867178.3144119-12848-73720396690920="` echo /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920 `" ) && sleep 0' 12033 1726867178.32095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867178.32194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867178.32230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.32249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867178.32332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.34265: stdout chunk (state=3): >>>ansible-tmp-1726867178.3144119-12848-73720396690920=/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920 <<< 12033 1726867178.34415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867178.34427: stdout chunk (state=3): >>><<< 12033 1726867178.34445: stderr chunk (state=3): >>><<< 12033 1726867178.34470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867178.3144119-12848-73720396690920=/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867178.34687: variable 'ansible_module_compression' from source: unknown 12033 1726867178.34693: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867178.34696: variable 'ansible_facts' from source: unknown 12033 1726867178.34731: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py 12033 1726867178.34928: Sending initial data 12033 1726867178.34937: Sent initial data (152 bytes) 12033 1726867178.35485: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867178.35494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867178.35504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867178.35517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867178.35593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867178.35613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867178.35625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.35633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867178.35704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.37303: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867178.37331: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867178.37382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867178.37460: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpaek94w37 /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py <<< 12033 1726867178.37484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py" <<< 12033 1726867178.37507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 12033 1726867178.37536: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpaek94w37" to remote "/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py" <<< 12033 1726867178.38404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867178.38419: stderr chunk (state=3): >>><<< 12033 1726867178.38442: stdout chunk (state=3): >>><<< 12033 1726867178.38521: done transferring module to remote 12033 1726867178.38524: _low_level_execute_command(): starting 12033 1726867178.38534: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/ /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py && sleep 0' 12033 1726867178.39220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867178.39282: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 12033 1726867178.39298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867178.39344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867178.39358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.39379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867178.39452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.41301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867178.41305: stdout chunk (state=3): >>><<< 12033 1726867178.41411: stderr chunk (state=3): >>><<< 12033 1726867178.41415: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867178.41418: _low_level_execute_command(): starting 12033 1726867178.41420: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/AnsiballZ_stat.py && sleep 0' 12033 1726867178.41974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867178.41996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867178.42010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867178.42122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.42154: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12033 1726867178.42245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.57398: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 1726867176.7908053, "mtime": 1726867176.7908053, "ctime": 1726867176.7908053, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867178.58796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867178.58799: stdout chunk (state=3): >>><<< 12033 1726867178.58802: stderr chunk (state=3): >>><<< 12033 1726867178.58805: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 1726867176.7908053, "mtime": 1726867176.7908053, "ctime": 1726867176.7908053, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867178.58829: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867178.58840: _low_level_execute_command(): starting 12033 1726867178.58845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867178.3144119-12848-73720396690920/ > /dev/null 2>&1 && sleep 0' 12033 1726867178.59530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867178.59549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867178.59561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867178.59575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867178.59664: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867178.59689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867178.59704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867178.59745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867178.59800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867178.61853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867178.61857: stdout chunk (state=3): >>><<< 12033 1726867178.61859: stderr chunk (state=3): >>><<< 12033 1726867178.61876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867178.62082: handler run complete 12033 1726867178.62086: attempt loop complete, returning result 12033 1726867178.62095: _execute() done 12033 1726867178.62098: dumping result to json 12033 1726867178.62100: done dumping result, returning 12033 1726867178.62102: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0affcac9-a3a5-74bb-502b-0000000004af] 12033 1726867178.62104: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004af 12033 1726867178.62182: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004af 12033 1726867178.62185: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867176.7908053, "block_size": 4096, "blocks": 0, "ctime": 1726867176.7908053, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28290, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726867176.7908053, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12033 1726867178.62281: no more pending 
results, returning what we have 12033 1726867178.62285: results queue empty 12033 1726867178.62286: checking for any_errors_fatal 12033 1726867178.62288: done checking for any_errors_fatal 12033 1726867178.62288: checking for max_fail_percentage 12033 1726867178.62290: done checking for max_fail_percentage 12033 1726867178.62291: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.62291: done checking to see if all hosts have failed 12033 1726867178.62292: getting the remaining hosts for this loop 12033 1726867178.62294: done getting the remaining hosts for this loop 12033 1726867178.62298: getting the next task for host managed_node3 12033 1726867178.62307: done getting next task for host managed_node3 12033 1726867178.62309: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12033 1726867178.62314: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12033 1726867178.62318: getting variables 12033 1726867178.62320: in VariableManager get_vars() 12033 1726867178.62349: Calling all_inventory to load vars for managed_node3 12033 1726867178.62352: Calling groups_inventory to load vars for managed_node3 12033 1726867178.62355: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.62365: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.62367: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.62370: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.63986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.65602: done with get_vars() 12033 1726867178.65621: done getting variables 12033 1726867178.65680: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867178.65818: variable 'interface' from source: task vars 12033 1726867178.65823: variable 'controller_device' from source: play vars 12033 1726867178.65892: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:38 -0400 (0:00:00.394) 0:00:17.775 ****** 12033 1726867178.65931: entering _queue_task() for managed_node3/assert 12033 1726867178.66274: worker is 1 (out of 1 available) 12033 1726867178.66393: exiting _queue_task() for managed_node3/assert 12033 1726867178.66402: done queuing things up, now waiting for results queue to drain 12033 1726867178.66404: waiting 
for pending results... 12033 1726867178.66613: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 12033 1726867178.66780: in run() - task 0affcac9-a3a5-74bb-502b-0000000003f6 12033 1726867178.66804: variable 'ansible_search_path' from source: unknown 12033 1726867178.66813: variable 'ansible_search_path' from source: unknown 12033 1726867178.66865: calling self._execute() 12033 1726867178.66970: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.66985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.67004: variable 'omit' from source: magic vars 12033 1726867178.67365: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.67393: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.67483: variable 'omit' from source: magic vars 12033 1726867178.67486: variable 'omit' from source: magic vars 12033 1726867178.67579: variable 'interface' from source: task vars 12033 1726867178.67593: variable 'controller_device' from source: play vars 12033 1726867178.67666: variable 'controller_device' from source: play vars 12033 1726867178.67696: variable 'omit' from source: magic vars 12033 1726867178.67746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867178.67793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867178.67819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867178.67847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.67882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.67899: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867178.67907: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.67913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.68028: Set connection var ansible_pipelining to False 12033 1726867178.68083: Set connection var ansible_shell_executable to /bin/sh 12033 1726867178.68087: Set connection var ansible_timeout to 10 12033 1726867178.68092: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867178.68095: Set connection var ansible_connection to ssh 12033 1726867178.68097: Set connection var ansible_shell_type to sh 12033 1726867178.68114: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.68122: variable 'ansible_connection' from source: unknown 12033 1726867178.68284: variable 'ansible_module_compression' from source: unknown 12033 1726867178.68288: variable 'ansible_shell_type' from source: unknown 12033 1726867178.68293: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.68295: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.68298: variable 'ansible_pipelining' from source: unknown 12033 1726867178.68301: variable 'ansible_timeout' from source: unknown 12033 1726867178.68303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.68320: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867178.68339: variable 'omit' from source: magic vars 12033 1726867178.68348: starting attempt loop 12033 1726867178.68355: running the handler 12033 1726867178.68503: variable 'interface_stat' from source: set_fact 12033 
1726867178.68535: Evaluated conditional (interface_stat.stat.exists): True 12033 1726867178.68545: handler run complete 12033 1726867178.68563: attempt loop complete, returning result 12033 1726867178.68570: _execute() done 12033 1726867178.68579: dumping result to json 12033 1726867178.68639: done dumping result, returning 12033 1726867178.68642: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0affcac9-a3a5-74bb-502b-0000000003f6] 12033 1726867178.68645: sending task result for task 0affcac9-a3a5-74bb-502b-0000000003f6 12033 1726867178.68719: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000003f6 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867178.68796: no more pending results, returning what we have 12033 1726867178.68800: results queue empty 12033 1726867178.68802: checking for any_errors_fatal 12033 1726867178.68809: done checking for any_errors_fatal 12033 1726867178.68810: checking for max_fail_percentage 12033 1726867178.68812: done checking for max_fail_percentage 12033 1726867178.68813: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.68814: done checking to see if all hosts have failed 12033 1726867178.68814: getting the remaining hosts for this loop 12033 1726867178.68816: done getting the remaining hosts for this loop 12033 1726867178.68820: getting the next task for host managed_node3 12033 1726867178.68833: done getting next task for host managed_node3 12033 1726867178.68837: ^ task is: TASK: Include the task 'assert_profile_present.yml' 12033 1726867178.68842: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867178.68847: getting variables 12033 1726867178.68850: in VariableManager get_vars() 12033 1726867178.68887: Calling all_inventory to load vars for managed_node3 12033 1726867178.68893: Calling groups_inventory to load vars for managed_node3 12033 1726867178.68897: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.68909: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.68913: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.68916: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.69620: WORKER PROCESS EXITING 12033 1726867178.70618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.72218: done with get_vars() 12033 1726867178.72245: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Friday 20 September 2024 17:19:38 -0400 (0:00:00.064) 0:00:17.839 ****** 12033 1726867178.72354: entering _queue_task() for managed_node3/include_tasks 12033 1726867178.72807: worker is 1 (out of 1 available) 12033 1726867178.72819: 
exiting _queue_task() for managed_node3/include_tasks 12033 1726867178.72830: done queuing things up, now waiting for results queue to drain 12033 1726867178.72831: waiting for pending results... 12033 1726867178.73035: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 12033 1726867178.73155: in run() - task 0affcac9-a3a5-74bb-502b-0000000003fb 12033 1726867178.73184: variable 'ansible_search_path' from source: unknown 12033 1726867178.73196: variable 'ansible_search_path' from source: unknown 12033 1726867178.73251: variable 'controller_profile' from source: play vars 12033 1726867178.73471: variable 'controller_profile' from source: play vars 12033 1726867178.73502: variable 'port1_profile' from source: play vars 12033 1726867178.73583: variable 'port1_profile' from source: play vars 12033 1726867178.73605: variable 'port2_profile' from source: play vars 12033 1726867178.73683: variable 'port2_profile' from source: play vars 12033 1726867178.73707: variable 'omit' from source: magic vars 12033 1726867178.73871: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.73893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.73911: variable 'omit' from source: magic vars 12033 1726867178.74185: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.74208: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.74245: variable 'bond_port_profile' from source: unknown 12033 1726867178.74366: variable 'bond_port_profile' from source: unknown 12033 1726867178.74656: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.74659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.74662: variable 'omit' from source: magic vars 12033 1726867178.74781: variable 'ansible_distribution_major_version' from source: facts 12033 
1726867178.74785: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.74788: variable 'bond_port_profile' from source: unknown 12033 1726867178.74850: variable 'bond_port_profile' from source: unknown 12033 1726867178.75083: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.75086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.75092: variable 'omit' from source: magic vars 12033 1726867178.75173: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.75187: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.75230: variable 'bond_port_profile' from source: unknown 12033 1726867178.75298: variable 'bond_port_profile' from source: unknown 12033 1726867178.75537: dumping result to json 12033 1726867178.75541: done dumping result, returning 12033 1726867178.75544: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0affcac9-a3a5-74bb-502b-0000000003fb] 12033 1726867178.75546: sending task result for task 0affcac9-a3a5-74bb-502b-0000000003fb 12033 1726867178.75592: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000003fb 12033 1726867178.75596: WORKER PROCESS EXITING 12033 1726867178.75667: no more pending results, returning what we have 12033 1726867178.75672: in VariableManager get_vars() 12033 1726867178.75716: Calling all_inventory to load vars for managed_node3 12033 1726867178.75720: Calling groups_inventory to load vars for managed_node3 12033 1726867178.75724: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.75738: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.75741: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.75745: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.77334: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.78918: done with get_vars() 12033 1726867178.78936: variable 'ansible_search_path' from source: unknown 12033 1726867178.78937: variable 'ansible_search_path' from source: unknown 12033 1726867178.78946: variable 'item' from source: include params 12033 1726867178.79053: variable 'item' from source: include params 12033 1726867178.79097: variable 'ansible_search_path' from source: unknown 12033 1726867178.79098: variable 'ansible_search_path' from source: unknown 12033 1726867178.79105: variable 'item' from source: include params 12033 1726867178.79170: variable 'item' from source: include params 12033 1726867178.79205: variable 'ansible_search_path' from source: unknown 12033 1726867178.79207: variable 'ansible_search_path' from source: unknown 12033 1726867178.79212: variable 'item' from source: include params 12033 1726867178.79271: variable 'item' from source: include params 12033 1726867178.79303: we have included files to process 12033 1726867178.79304: generating all_blocks data 12033 1726867178.79306: done generating all_blocks data 12033 1726867178.79310: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.79311: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.79313: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.79510: in VariableManager get_vars() 12033 1726867178.79531: done with get_vars() 12033 1726867178.79821: done processing included file 12033 1726867178.79823: iterating over new_blocks loaded from include file 12033 1726867178.79824: in VariableManager get_vars() 12033 1726867178.79838: 
done with get_vars() 12033 1726867178.79840: filtering new block on tags 12033 1726867178.79909: done filtering new block on tags 12033 1726867178.79912: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 12033 1726867178.79917: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.79918: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.79921: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.80029: in VariableManager get_vars() 12033 1726867178.80050: done with get_vars() 12033 1726867178.80300: done processing included file 12033 1726867178.80302: iterating over new_blocks loaded from include file 12033 1726867178.80303: in VariableManager get_vars() 12033 1726867178.80382: done with get_vars() 12033 1726867178.80385: filtering new block on tags 12033 1726867178.80450: done filtering new block on tags 12033 1726867178.80453: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 12033 1726867178.80457: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.80458: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.80461: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12033 1726867178.80569: in VariableManager get_vars() 12033 1726867178.80592: done with get_vars() 12033 1726867178.80837: done processing included file 12033 1726867178.80839: iterating over new_blocks loaded from include file 12033 1726867178.80841: in VariableManager get_vars() 12033 1726867178.80854: done with get_vars() 12033 1726867178.80856: filtering new block on tags 12033 1726867178.80919: done filtering new block on tags 12033 1726867178.80922: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 12033 1726867178.80925: extending task lists for all hosts with included blocks 12033 1726867178.81043: done extending task lists 12033 1726867178.81044: done processing included files 12033 1726867178.81045: results queue empty 12033 1726867178.81046: checking for any_errors_fatal 12033 1726867178.81049: done checking for any_errors_fatal 12033 1726867178.81050: checking for max_fail_percentage 12033 1726867178.81051: done checking for max_fail_percentage 12033 1726867178.81052: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.81052: done checking to see if all hosts have failed 12033 1726867178.81053: getting the remaining hosts for this loop 12033 1726867178.81054: done getting the remaining hosts for this loop 12033 1726867178.81057: getting the next task for host managed_node3 12033 1726867178.81061: done getting next task for host managed_node3 12033 1726867178.81063: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12033 1726867178.81066: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867178.81069: getting variables 12033 1726867178.81069: in VariableManager get_vars() 12033 1726867178.81080: Calling all_inventory to load vars for managed_node3 12033 1726867178.81082: Calling groups_inventory to load vars for managed_node3 12033 1726867178.81084: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.81092: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.81100: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.81104: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.82332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.83881: done with get_vars() 12033 1726867178.83900: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 17:19:38 -0400 (0:00:00.116) 0:00:17.955 ****** 12033 1726867178.83973: entering _queue_task() for managed_node3/include_tasks 12033 1726867178.84328: worker is 1 (out of 1 available) 12033 1726867178.84340: exiting _queue_task() for managed_node3/include_tasks 12033 1726867178.84357: done queuing things up, now waiting for results queue to drain 12033 1726867178.84359: waiting for pending results... 12033 1726867178.84658: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12033 1726867178.84809: in run() - task 0affcac9-a3a5-74bb-502b-0000000004d9 12033 1726867178.84832: variable 'ansible_search_path' from source: unknown 12033 1726867178.84841: variable 'ansible_search_path' from source: unknown 12033 1726867178.84904: calling self._execute() 12033 1726867178.84996: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.85118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.85123: variable 'omit' from source: magic vars 12033 1726867178.85452: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.85472: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.85486: _execute() done 12033 1726867178.85553: dumping result to json 12033 1726867178.85557: done dumping result, returning 12033 1726867178.85559: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-74bb-502b-0000000004d9] 12033 1726867178.85565: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004d9 12033 1726867178.85639: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004d9 12033 1726867178.85642: WORKER PROCESS EXITING 12033 1726867178.85702: no more pending results, returning what we have 12033 1726867178.85708: in VariableManager get_vars() 12033 1726867178.85746: Calling all_inventory to load vars for managed_node3 12033 
1726867178.85749: Calling groups_inventory to load vars for managed_node3 12033 1726867178.85752: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.85981: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.85985: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.85992: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.87368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.88961: done with get_vars() 12033 1726867178.88980: variable 'ansible_search_path' from source: unknown 12033 1726867178.88982: variable 'ansible_search_path' from source: unknown 12033 1726867178.89024: we have included files to process 12033 1726867178.89025: generating all_blocks data 12033 1726867178.89027: done generating all_blocks data 12033 1726867178.89028: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867178.89029: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867178.89031: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867178.90143: done processing included file 12033 1726867178.90145: iterating over new_blocks loaded from include file 12033 1726867178.90146: in VariableManager get_vars() 12033 1726867178.90163: done with get_vars() 12033 1726867178.90165: filtering new block on tags 12033 1726867178.90318: done filtering new block on tags 12033 1726867178.90321: in VariableManager get_vars() 12033 1726867178.90338: done with get_vars() 12033 1726867178.90340: filtering new block on tags 12033 1726867178.90406: done filtering new block on tags 12033 1726867178.90408: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12033 1726867178.90414: extending task lists for all hosts with included blocks 12033 1726867178.90814: done extending task lists 12033 1726867178.90815: done processing included files 12033 1726867178.90816: results queue empty 12033 1726867178.90817: checking for any_errors_fatal 12033 1726867178.90821: done checking for any_errors_fatal 12033 1726867178.90822: checking for max_fail_percentage 12033 1726867178.90823: done checking for max_fail_percentage 12033 1726867178.90824: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.90825: done checking to see if all hosts have failed 12033 1726867178.90826: getting the remaining hosts for this loop 12033 1726867178.90827: done getting the remaining hosts for this loop 12033 1726867178.90829: getting the next task for host managed_node3 12033 1726867178.90835: done getting next task for host managed_node3 12033 1726867178.90837: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867178.90841: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867178.90844: getting variables 12033 1726867178.90845: in VariableManager get_vars() 12033 1726867178.90859: Calling all_inventory to load vars for managed_node3 12033 1726867178.90861: Calling groups_inventory to load vars for managed_node3 12033 1726867178.90863: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.90869: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.90871: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.90874: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.92107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867178.93675: done with get_vars() 12033 1726867178.93699: done getting variables 12033 1726867178.93740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:38 -0400 (0:00:00.097) 0:00:18.053 ****** 12033 1726867178.93773: entering _queue_task() for managed_node3/set_fact 12033 1726867178.94299: worker is 1 (out of 1 available) 12033 1726867178.94308: exiting _queue_task() for managed_node3/set_fact 12033 1726867178.94318: done queuing things up, now waiting for results queue to drain 12033 1726867178.94320: waiting for pending results... 12033 1726867178.94561: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867178.94583: in run() - task 0affcac9-a3a5-74bb-502b-0000000004fc 12033 1726867178.94609: variable 'ansible_search_path' from source: unknown 12033 1726867178.94616: variable 'ansible_search_path' from source: unknown 12033 1726867178.94661: calling self._execute() 12033 1726867178.94753: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.94775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.94797: variable 'omit' from source: magic vars 12033 1726867178.95218: variable 'ansible_distribution_major_version' from source: facts 12033 1726867178.95237: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867178.95249: variable 'omit' from source: magic vars 12033 1726867178.95382: variable 'omit' from source: magic vars 12033 1726867178.95386: variable 'omit' from source: magic vars 12033 1726867178.95407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867178.95457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867178.95486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867178.95513: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.95538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867178.95573: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867178.95586: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.95599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.95714: Set connection var ansible_pipelining to False 12033 1726867178.95730: Set connection var ansible_shell_executable to /bin/sh 12033 1726867178.95781: Set connection var ansible_timeout to 10 12033 1726867178.95784: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867178.95787: Set connection var ansible_connection to ssh 12033 1726867178.95791: Set connection var ansible_shell_type to sh 12033 1726867178.95802: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.95810: variable 'ansible_connection' from source: unknown 12033 1726867178.95818: variable 'ansible_module_compression' from source: unknown 12033 1726867178.95825: variable 'ansible_shell_type' from source: unknown 12033 1726867178.95860: variable 'ansible_shell_executable' from source: unknown 12033 1726867178.95863: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867178.95865: variable 'ansible_pipelining' from source: unknown 12033 1726867178.95867: variable 'ansible_timeout' from source: unknown 12033 1726867178.95869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867178.96017: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867178.96034: variable 'omit' from source: magic vars 12033 1726867178.96080: starting attempt loop 12033 1726867178.96083: running the handler 12033 1726867178.96086: handler run complete 12033 1726867178.96091: attempt loop complete, returning result 12033 1726867178.96094: _execute() done 12033 1726867178.96099: dumping result to json 12033 1726867178.96108: done dumping result, returning 12033 1726867178.96186: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-74bb-502b-0000000004fc] 12033 1726867178.96192: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fc 12033 1726867178.96258: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fc 12033 1726867178.96262: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12033 1726867178.96348: no more pending results, returning what we have 12033 1726867178.96353: results queue empty 12033 1726867178.96354: checking for any_errors_fatal 12033 1726867178.96356: done checking for any_errors_fatal 12033 1726867178.96356: checking for max_fail_percentage 12033 1726867178.96358: done checking for max_fail_percentage 12033 1726867178.96360: checking to see if all hosts have failed and the running result is not ok 12033 1726867178.96361: done checking to see if all hosts have failed 12033 1726867178.96361: getting the remaining hosts for this loop 12033 1726867178.96364: done getting the remaining hosts for this loop 12033 1726867178.96367: getting the next task for host managed_node3 12033 1726867178.96379: done getting next task for host managed_node3 12033 1726867178.96382: ^ task is: 
TASK: Stat profile file 12033 1726867178.96388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867178.96395: getting variables 12033 1726867178.96397: in VariableManager get_vars() 12033 1726867178.96520: Calling all_inventory to load vars for managed_node3 12033 1726867178.96523: Calling groups_inventory to load vars for managed_node3 12033 1726867178.96527: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867178.96538: Calling all_plugins_play to load vars for managed_node3 12033 1726867178.96542: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867178.96545: Calling groups_plugins_play to load vars for managed_node3 12033 1726867178.98137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867179.01211: done with get_vars() 12033 1726867179.01231: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:39 -0400 (0:00:00.075) 0:00:18.129 ****** 12033 1726867179.01340: entering _queue_task() for managed_node3/stat 12033 1726867179.01759: worker is 1 (out of 1 available) 12033 1726867179.01773: exiting _queue_task() for managed_node3/stat 12033 1726867179.01903: done queuing things up, now waiting for results queue to drain 12033 1726867179.01905: waiting for pending results... 
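The "Initialize NM profile exist and ansible_managed comment flag" task whose result is logged above (`lsr_net_profile_ansible_managed`, `lsr_net_profile_exists`, and `lsr_net_profile_fingerprint` all set to `false`, `changed: false`) corresponds to a `set_fact` task of roughly the following shape. This is a sketch reconstructed from the logged `ansible_facts` output; the actual task in `get_profile_stat.yml:3` may name or order things differently:

```yaml
# Sketch of the task at get_profile_stat.yml:3, inferred from the logged result.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because `set_fact` runs entirely on the controller, the log shows no connection or module transfer for this task: the handler runs and completes immediately ("running the handler ... handler run complete") before the result is sent back to the results queue.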
12033 1726867179.02203: running TaskExecutor() for managed_node3/TASK: Stat profile file 12033 1726867179.02306: in run() - task 0affcac9-a3a5-74bb-502b-0000000004fd 12033 1726867179.02333: variable 'ansible_search_path' from source: unknown 12033 1726867179.02384: variable 'ansible_search_path' from source: unknown 12033 1726867179.02388: calling self._execute() 12033 1726867179.02499: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.02511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.02525: variable 'omit' from source: magic vars 12033 1726867179.02933: variable 'ansible_distribution_major_version' from source: facts 12033 1726867179.02986: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867179.02991: variable 'omit' from source: magic vars 12033 1726867179.03027: variable 'omit' from source: magic vars 12033 1726867179.03126: variable 'profile' from source: include params 12033 1726867179.03183: variable 'bond_port_profile' from source: include params 12033 1726867179.03209: variable 'bond_port_profile' from source: include params 12033 1726867179.03233: variable 'omit' from source: magic vars 12033 1726867179.03272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867179.03319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867179.03344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867179.03383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.03385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.03412: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867179.03425: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.03482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.03548: Set connection var ansible_pipelining to False 12033 1726867179.03563: Set connection var ansible_shell_executable to /bin/sh 12033 1726867179.03576: Set connection var ansible_timeout to 10 12033 1726867179.03594: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867179.03602: Set connection var ansible_connection to ssh 12033 1726867179.03612: Set connection var ansible_shell_type to sh 12033 1726867179.03642: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.03655: variable 'ansible_connection' from source: unknown 12033 1726867179.03747: variable 'ansible_module_compression' from source: unknown 12033 1726867179.03750: variable 'ansible_shell_type' from source: unknown 12033 1726867179.03754: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.03757: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.03759: variable 'ansible_pipelining' from source: unknown 12033 1726867179.03761: variable 'ansible_timeout' from source: unknown 12033 1726867179.03763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.03920: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867179.03937: variable 'omit' from source: magic vars 12033 1726867179.03948: starting attempt loop 12033 1726867179.03954: running the handler 12033 1726867179.04027: _low_level_execute_command(): starting 12033 1726867179.04041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867179.05886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.05892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.05895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.05897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.06003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.06415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.06798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.08427: stdout chunk (state=3): >>>/root <<< 12033 1726867179.08430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.08583: stderr chunk (state=3): >>><<< 12033 1726867179.08587: stdout chunk (state=3): >>><<< 12033 1726867179.08594: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.08597: _low_level_execute_command(): starting 12033 1726867179.08600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027 `" && echo ansible-tmp-1726867179.0848105-12879-224421496767027="` echo /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027 `" ) && sleep 0' 12033 1726867179.09582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.09617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.09696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867179.09796: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.09838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.11733: stdout chunk (state=3): >>>ansible-tmp-1726867179.0848105-12879-224421496767027=/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027 <<< 12033 1726867179.11926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.11930: stdout chunk (state=3): >>><<< 12033 1726867179.11938: stderr chunk (state=3): >>><<< 12033 1726867179.11956: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867179.0848105-12879-224421496767027=/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.12004: variable 'ansible_module_compression' from source: unknown 12033 1726867179.12063: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867179.12306: variable 'ansible_facts' from source: unknown 12033 1726867179.12543: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py 12033 1726867179.12840: Sending initial data 12033 1726867179.12843: Sent initial data (153 bytes) 12033 1726867179.13362: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.13371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.13385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.13405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.13548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867179.13552: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867179.13554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867179.13556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867179.13560: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.13563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.13565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.13652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.13764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.15295: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867179.15324: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867179.15370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 12033 1726867179.15430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpn9uy_2w1 /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py <<< 12033 1726867179.15433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py" <<< 12033 1726867179.15458: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpn9uy_2w1" to remote "/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py" <<< 12033 1726867179.16241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.16291: stderr chunk (state=3): >>><<< 12033 1726867179.16301: stdout chunk (state=3): >>><<< 12033 1726867179.16394: done transferring module to remote 12033 1726867179.16437: _low_level_execute_command(): starting 12033 1726867179.16440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/ /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py && sleep 0' 12033 1726867179.17406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.17421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.17636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.17657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.17676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.17759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.19546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.19558: stdout chunk (state=3): >>><<< 12033 1726867179.19571: stderr chunk (state=3): >>><<< 12033 1726867179.19598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.19617: _low_level_execute_command(): starting 12033 1726867179.19629: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/AnsiballZ_stat.py && sleep 0' 12033 1726867179.20274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.20324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.20338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.20357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.20599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 
1726867179.35624: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867179.36958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867179.36967: stdout chunk (state=3): >>><<< 12033 1726867179.36988: stderr chunk (state=3): >>><<< 12033 1726867179.37009: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867179.37040: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867179.37052: _low_level_execute_command(): starting 12033 1726867179.37059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867179.0848105-12879-224421496767027/ > /dev/null 2>&1 && sleep 0' 12033 1726867179.37659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.37674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.37696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.37716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.37829: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.37855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.37873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.37963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.39803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.39880: stderr chunk (state=3): >>><<< 12033 1726867179.39884: stdout chunk (state=3): >>><<< 12033 1726867179.40083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.40087: handler run complete 12033 1726867179.40093: attempt loop complete, returning result 12033 1726867179.40095: _execute() done 12033 1726867179.40097: dumping result to json 12033 1726867179.40100: done dumping result, returning 12033 1726867179.40102: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-74bb-502b-0000000004fd] 12033 1726867179.40104: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fd 12033 1726867179.40176: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fd 12033 1726867179.40182: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12033 1726867179.40247: no more pending results, returning what we have 12033 1726867179.40251: results queue empty 12033 1726867179.40252: checking for any_errors_fatal 12033 1726867179.40260: done checking for any_errors_fatal 12033 1726867179.40261: checking for max_fail_percentage 12033 1726867179.40263: done checking for max_fail_percentage 12033 1726867179.40265: checking to see if all hosts have failed and the running result is not ok 12033 1726867179.40265: done checking to see if all hosts have failed 12033 1726867179.40266: getting the remaining hosts for this loop 12033 1726867179.40268: done getting the remaining hosts for this loop 12033 1726867179.40272: getting the next task for host managed_node3 12033 1726867179.40289: done getting next task for host managed_node3 12033 1726867179.40294: ^ task is: TASK: Set NM profile exist flag based on the profile files 12033 1726867179.40301: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867179.40306: getting variables 12033 1726867179.40308: in VariableManager get_vars() 12033 1726867179.40341: Calling all_inventory to load vars for managed_node3 12033 1726867179.40344: Calling groups_inventory to load vars for managed_node3 12033 1726867179.40348: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867179.40360: Calling all_plugins_play to load vars for managed_node3 12033 1726867179.40363: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867179.40365: Calling groups_plugins_play to load vars for managed_node3 12033 1726867179.42005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867179.43895: done with get_vars() 12033 1726867179.43918: done getting variables 12033 1726867179.43982: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:39 -0400 (0:00:00.426) 0:00:18.555 ****** 12033 1726867179.44018: entering _queue_task() for managed_node3/set_fact 12033 1726867179.44331: worker is 1 (out of 1 available) 12033 1726867179.44345: exiting _queue_task() for managed_node3/set_fact 12033 1726867179.44358: done queuing things up, now waiting for results queue to drain 12033 1726867179.44360: waiting for pending results... 
12033 1726867179.44697: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12033 1726867179.44716: in run() - task 0affcac9-a3a5-74bb-502b-0000000004fe 12033 1726867179.44752: variable 'ansible_search_path' from source: unknown 12033 1726867179.44756: variable 'ansible_search_path' from source: unknown 12033 1726867179.44766: calling self._execute() 12033 1726867179.44883: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.44887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.44894: variable 'omit' from source: magic vars 12033 1726867179.45230: variable 'ansible_distribution_major_version' from source: facts 12033 1726867179.45284: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867179.45371: variable 'profile_stat' from source: set_fact 12033 1726867179.45384: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867179.45387: when evaluation is False, skipping this task 12033 1726867179.45393: _execute() done 12033 1726867179.45396: dumping result to json 12033 1726867179.45484: done dumping result, returning 12033 1726867179.45487: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-74bb-502b-0000000004fe] 12033 1726867179.45491: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fe 12033 1726867179.45551: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004fe 12033 1726867179.45554: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867179.45602: no more pending results, returning what we have 12033 1726867179.45606: results queue empty 12033 1726867179.45607: checking for any_errors_fatal 12033 1726867179.45612: done checking for any_errors_fatal 12033 1726867179.45613: 
checking for max_fail_percentage 12033 1726867179.45614: done checking for max_fail_percentage 12033 1726867179.45615: checking to see if all hosts have failed and the running result is not ok 12033 1726867179.45616: done checking to see if all hosts have failed 12033 1726867179.45616: getting the remaining hosts for this loop 12033 1726867179.45617: done getting the remaining hosts for this loop 12033 1726867179.45620: getting the next task for host managed_node3 12033 1726867179.45626: done getting next task for host managed_node3 12033 1726867179.45629: ^ task is: TASK: Get NM profile info 12033 1726867179.45633: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12033 1726867179.45637: getting variables 12033 1726867179.45638: in VariableManager get_vars() 12033 1726867179.45662: Calling all_inventory to load vars for managed_node3 12033 1726867179.45665: Calling groups_inventory to load vars for managed_node3 12033 1726867179.45667: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867179.45676: Calling all_plugins_play to load vars for managed_node3 12033 1726867179.45684: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867179.45688: Calling groups_plugins_play to load vars for managed_node3 12033 1726867179.47134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867179.48719: done with get_vars() 12033 1726867179.48741: done getting variables 12033 1726867179.48807: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:39 -0400 (0:00:00.048) 0:00:18.604 ****** 12033 1726867179.48845: entering _queue_task() for managed_node3/shell 12033 1726867179.49140: worker is 1 (out of 1 available) 12033 1726867179.49160: exiting _queue_task() for managed_node3/shell 12033 1726867179.49171: done queuing things up, now waiting for results queue to drain 12033 1726867179.49173: waiting for pending results... 
12033 1726867179.49496: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12033 1726867179.49535: in run() - task 0affcac9-a3a5-74bb-502b-0000000004ff 12033 1726867179.49586: variable 'ansible_search_path' from source: unknown 12033 1726867179.49595: variable 'ansible_search_path' from source: unknown 12033 1726867179.49598: calling self._execute() 12033 1726867179.49696: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.49700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.49704: variable 'omit' from source: magic vars 12033 1726867179.50053: variable 'ansible_distribution_major_version' from source: facts 12033 1726867179.50132: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867179.50135: variable 'omit' from source: magic vars 12033 1726867179.50138: variable 'omit' from source: magic vars 12033 1726867179.50240: variable 'profile' from source: include params 12033 1726867179.50244: variable 'bond_port_profile' from source: include params 12033 1726867179.50309: variable 'bond_port_profile' from source: include params 12033 1726867179.50329: variable 'omit' from source: magic vars 12033 1726867179.50367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867179.50409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867179.50428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867179.50446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.50457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.50564: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867179.50568: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.50571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.50597: Set connection var ansible_pipelining to False 12033 1726867179.50606: Set connection var ansible_shell_executable to /bin/sh 12033 1726867179.50619: Set connection var ansible_timeout to 10 12033 1726867179.50624: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867179.50627: Set connection var ansible_connection to ssh 12033 1726867179.50632: Set connection var ansible_shell_type to sh 12033 1726867179.50653: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.50656: variable 'ansible_connection' from source: unknown 12033 1726867179.50658: variable 'ansible_module_compression' from source: unknown 12033 1726867179.50661: variable 'ansible_shell_type' from source: unknown 12033 1726867179.50663: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.50666: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.50669: variable 'ansible_pipelining' from source: unknown 12033 1726867179.50671: variable 'ansible_timeout' from source: unknown 12033 1726867179.50798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.50814: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867179.50824: variable 'omit' from source: magic vars 12033 1726867179.50834: starting attempt loop 12033 1726867179.50838: running the handler 12033 1726867179.50847: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867179.50866: _low_level_execute_command(): starting 12033 1726867179.50874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867179.51576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.51592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.51600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.51681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.51684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867179.51687: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867179.51692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.51694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867179.51696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867179.51698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867179.51702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.51742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 
12033 1726867179.51756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.51807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.51853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.53555: stdout chunk (state=3): >>>/root <<< 12033 1726867179.53718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.53722: stdout chunk (state=3): >>><<< 12033 1726867179.53726: stderr chunk (state=3): >>><<< 12033 1726867179.53747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.53847: _low_level_execute_command(): starting 12033 1726867179.53851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173 `" && echo ansible-tmp-1726867179.5375955-12912-21009977195173="` echo /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173 `" ) && sleep 0' 12033 1726867179.54436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.54452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.54467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.54488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.54529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867179.54598: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.54646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.54662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.54687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.54772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.56697: stdout chunk (state=3): 
>>>ansible-tmp-1726867179.5375955-12912-21009977195173=/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173 <<< 12033 1726867179.56856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.56862: stdout chunk (state=3): >>><<< 12033 1726867179.56865: stderr chunk (state=3): >>><<< 12033 1726867179.56888: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867179.5375955-12912-21009977195173=/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.57083: variable 'ansible_module_compression' from source: unknown 12033 1726867179.57086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867179.57088: variable 'ansible_facts' 
from source: unknown 12033 1726867179.57111: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py 12033 1726867179.57339: Sending initial data 12033 1726867179.57342: Sent initial data (155 bytes) 12033 1726867179.57863: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.57873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.57886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.57906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.57996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.58003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.58020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.58029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.58107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.59673: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867179.59734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867179.59799: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk7yzpvva /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py <<< 12033 1726867179.59803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py" <<< 12033 1726867179.59852: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk7yzpvva" to remote "/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py" <<< 12033 1726867179.60921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.60925: stdout chunk (state=3): >>><<< 12033 1726867179.60979: stderr chunk (state=3): >>><<< 12033 1726867179.60982: done transferring module to remote 12033 1726867179.60985: _low_level_execute_command(): starting 12033 1726867179.60991: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/ /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py && sleep 0' 12033 1726867179.61544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.61553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.61657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.61701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.61747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.63504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.63682: stderr chunk (state=3): >>><<< 12033 1726867179.63686: stdout chunk (state=3): >>><<< 12033 1726867179.63691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867179.63695: _low_level_execute_command(): starting 12033 1726867179.63697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/AnsiballZ_command.py && sleep 0' 12033 1726867179.64195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867179.64211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.64228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867179.64253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.64271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867179.64320: stderr chunk (state=3): >>>debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.64387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867179.64412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.64512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.81785: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:19:39.794237", "end": "2024-09-20 17:19:39.814804", "delta": "0:00:00.020567", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867179.83253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867179.83258: stdout chunk (state=3): >>><<< 12033 1726867179.83263: stderr chunk (state=3): >>><<< 12033 1726867179.83405: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:19:39.794237", "end": "2024-09-20 17:19:39.814804", "delta": "0:00:00.020567", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867179.83436: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867179.83444: _low_level_execute_command(): starting 12033 1726867179.83449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867179.5375955-12912-21009977195173/ > /dev/null 2>&1 && sleep 0' 12033 1726867179.84554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867179.84560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867179.84605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867179.84693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867179.84896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867179.84966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867179.87083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867179.87086: stdout chunk (state=3): >>><<< 12033 1726867179.87089: stderr chunk (state=3): >>><<< 12033 1726867179.87091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867179.87095: handler run complete
12033 1726867179.87097: Evaluated conditional (False): False
12033 1726867179.87100: attempt loop complete, returning result
12033 1726867179.87103: _execute() done
12033 1726867179.87105: dumping result to json
12033 1726867179.87108: done dumping result, returning
12033 1726867179.87111: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-74bb-502b-0000000004ff]
12033 1726867179.87113: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004ff
12033 1726867179.87383: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004ff
ok: [managed_node3] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc",
    "delta": "0:00:00.020567",
    "end": "2024-09-20 17:19:39.814804",
    "rc": 0,
    "start": "2024-09-20 17:19:39.794237"
}

STDOUT:

bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection
bond0 /etc/NetworkManager/system-connections/bond0.nmconnection

12033 1726867179.87463: no more pending results, returning what we have
12033 1726867179.87467: results queue empty
12033 1726867179.87468: checking for any_errors_fatal
12033 1726867179.87476: done checking for any_errors_fatal
12033 1726867179.87476: checking for max_fail_percentage
12033 1726867179.87582: done checking for max_fail_percentage
12033 1726867179.87584: checking to see if all hosts have failed and the running result is not ok
12033 1726867179.87584: done checking to see if all hosts have failed
12033 1726867179.87585: getting the remaining hosts for this loop
12033 1726867179.87587: done getting the remaining hosts for this loop
12033 1726867179.87593: getting the next task for host
managed_node3 12033 1726867179.87600: done getting next task for host managed_node3 12033 1726867179.87603: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867179.87609: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867179.87612: getting variables 12033 1726867179.87613: in VariableManager get_vars() 12033 1726867179.87643: Calling all_inventory to load vars for managed_node3 12033 1726867179.87646: Calling groups_inventory to load vars for managed_node3 12033 1726867179.87649: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867179.87660: Calling all_plugins_play to load vars for managed_node3 12033 1726867179.87663: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867179.87666: Calling groups_plugins_play to load vars for managed_node3 12033 1726867179.87672: WORKER PROCESS EXITING 12033 1726867179.90874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867179.92840: done with get_vars() 12033 1726867179.92865: done getting variables 12033 1726867179.93005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:39 -0400 (0:00:00.441) 0:00:19.046 ****** 12033 1726867179.93039: entering _queue_task() for managed_node3/set_fact 12033 1726867179.93642: worker is 1 (out of 1 available) 12033 1726867179.93656: exiting _queue_task() for managed_node3/set_fact 12033 1726867179.93668: done queuing things up, now waiting for results queue to drain 12033 1726867179.93670: waiting for pending results... 
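The "Get NM profile info" task above ran `nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc` through the `ansible.legacy.command` module, which returned its result as a JSON document on stdout (captured verbatim in the log). A minimal sketch, not Ansible's own code, of turning that module result into a connection-name-to-file mapping, using the exact stdout recorded above:

```python
import json

# The module result as captured in the log, abridged to the fields used here.
raw = json.dumps({
    "changed": True,
    "rc": 0,
    "stdout": (
        "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \n"
        "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \n"
        "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection "
    ),
})

def parse_nmcli_profiles(result_json: str) -> dict:
    """Map connection NAME -> backing FILENAME from `nmcli -f NAME,FILENAME` output."""
    result = json.loads(result_json)
    profiles = {}
    for line in result["stdout"].splitlines():
        parts = line.split()
        if len(parts) >= 2:
            profiles[parts[0]] = parts[1]
    return profiles

profiles = parse_nmcli_profiles(raw)
print(profiles["bond0"])  # /etc/NetworkManager/system-connections/bond0.nmconnection
```

Splitting on whitespace is safe here because NetworkManager connection names in this run contain no spaces; a name with spaces would need `nmcli -t` (colon-separated terse output) instead.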
12033 1726867179.94116: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867179.94438: in run() - task 0affcac9-a3a5-74bb-502b-000000000500 12033 1726867179.94453: variable 'ansible_search_path' from source: unknown 12033 1726867179.94457: variable 'ansible_search_path' from source: unknown 12033 1726867179.94614: calling self._execute() 12033 1726867179.94809: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.94820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.94830: variable 'omit' from source: magic vars 12033 1726867179.95419: variable 'ansible_distribution_major_version' from source: facts 12033 1726867179.95431: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867179.95570: variable 'nm_profile_exists' from source: set_fact 12033 1726867179.95589: Evaluated conditional (nm_profile_exists.rc == 0): True 12033 1726867179.95598: variable 'omit' from source: magic vars 12033 1726867179.95661: variable 'omit' from source: magic vars 12033 1726867179.95701: variable 'omit' from source: magic vars 12033 1726867179.95740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867179.95771: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867179.95806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867179.95822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.95833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867179.95865: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12033 1726867179.95869: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.95871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.96202: Set connection var ansible_pipelining to False 12033 1726867179.96205: Set connection var ansible_shell_executable to /bin/sh 12033 1726867179.96208: Set connection var ansible_timeout to 10 12033 1726867179.96210: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867179.96212: Set connection var ansible_connection to ssh 12033 1726867179.96214: Set connection var ansible_shell_type to sh 12033 1726867179.96217: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.96220: variable 'ansible_connection' from source: unknown 12033 1726867179.96223: variable 'ansible_module_compression' from source: unknown 12033 1726867179.96225: variable 'ansible_shell_type' from source: unknown 12033 1726867179.96228: variable 'ansible_shell_executable' from source: unknown 12033 1726867179.96230: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867179.96232: variable 'ansible_pipelining' from source: unknown 12033 1726867179.96234: variable 'ansible_timeout' from source: unknown 12033 1726867179.96236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867179.96239: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867179.96244: variable 'omit' from source: magic vars 12033 1726867179.96249: starting attempt loop 12033 1726867179.96252: running the handler 12033 1726867179.96263: handler run complete 12033 1726867179.96272: attempt loop complete, returning result 12033 1726867179.96275: _execute() done 
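The set_fact task above only runs after both `when` conditionals evaluate True: `ansible_distribution_major_version != '6'` and `nm_profile_exists.rc == 0`. A simplified sketch of that gating; Ansible's real evaluator templates the conditionals through Jinja2, and the distribution version value here is a made-up placeholder (the log does not print it):

```python
# Hypothetical stand-ins for the task variables shown in the log;
# "10" is an assumed value, not taken from the run.
facts = {"ansible_distribution_major_version": "10"}
nm_profile_exists = {"rc": 0}  # the command task above exited 0

conditionals = [
    facts["ansible_distribution_major_version"] != "6",
    nm_profile_exists["rc"] == 0,
]

# Only when every `when` clause holds does the set_fact handler run,
# producing the ansible_facts payload reported for this task.
if all(conditionals):
    facts.update({
        "lsr_net_profile_exists": True,
        "lsr_net_profile_ansible_managed": True,
        "lsr_net_profile_fingerprint": True,
    })

print(facts["lsr_net_profile_exists"])  # True
```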
12033 1726867179.96279: dumping result to json
12033 1726867179.96281: done dumping result, returning
12033 1726867179.96292: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-74bb-502b-000000000500]
12033 1726867179.96298: sending task result for task 0affcac9-a3a5-74bb-502b-000000000500
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
12033 1726867179.96538: no more pending results, returning what we have
12033 1726867179.96541: results queue empty
12033 1726867179.96542: checking for any_errors_fatal
12033 1726867179.96550: done checking for any_errors_fatal
12033 1726867179.96551: checking for max_fail_percentage
12033 1726867179.96553: done checking for max_fail_percentage
12033 1726867179.96554: checking to see if all hosts have failed and the running result is not ok
12033 1726867179.96555: done checking to see if all hosts have failed
12033 1726867179.96556: getting the remaining hosts for this loop
12033 1726867179.96558: done getting the remaining hosts for this loop
12033 1726867179.96561: getting the next task for host managed_node3
12033 1726867179.96571: done getting next task for host managed_node3
12033 1726867179.96573: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
12033 1726867179.96580: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867179.96583: getting variables 12033 1726867179.96585: in VariableManager get_vars() 12033 1726867179.96615: Calling all_inventory to load vars for managed_node3 12033 1726867179.96617: Calling groups_inventory to load vars for managed_node3 12033 1726867179.96620: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867179.96631: Calling all_plugins_play to load vars for managed_node3 12033 1726867179.96633: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867179.96636: Calling groups_plugins_play to load vars for managed_node3 12033 1726867179.97600: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000500 12033 1726867179.97604: WORKER PROCESS EXITING 12033 1726867179.99420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.01718: done with get_vars() 12033 1726867180.01754: done getting variables 12033 1726867180.01816: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.01947: variable 'profile' from source: include params 12033 1726867180.01951: variable 'bond_port_profile' from source: include params 12033 1726867180.02010: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:40 -0400 (0:00:00.090) 0:00:19.136 ****** 12033 1726867180.02043: entering _queue_task() for managed_node3/command 12033 1726867180.02361: worker is 1 (out of 1 available) 12033 1726867180.02373: exiting _queue_task() for 
managed_node3/command
12033 1726867180.02387: done queuing things up, now waiting for results queue to drain
12033 1726867180.02389: waiting for pending results...
12033 1726867180.02674: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0
12033 1726867180.02831: in run() - task 0affcac9-a3a5-74bb-502b-000000000502
12033 1726867180.02853: variable 'ansible_search_path' from source: unknown
12033 1726867180.02862: variable 'ansible_search_path' from source: unknown
12033 1726867180.02906: calling self._execute()
12033 1726867180.03002: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867180.03014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867180.03037: variable 'omit' from source: magic vars
12033 1726867180.03401: variable 'ansible_distribution_major_version' from source: facts
12033 1726867180.03419: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867180.03551: variable 'profile_stat' from source: set_fact
12033 1726867180.03575: Evaluated conditional (profile_stat.stat.exists): False
12033 1726867180.03587: when evaluation is False, skipping this task
12033 1726867180.03597: _execute() done
12033 1726867180.03683: dumping result to json
12033 1726867180.03686: done dumping result, returning
12033 1726867180.03688: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-74bb-502b-000000000502]
12033 1726867180.03690: sending task result for task 0affcac9-a3a5-74bb-502b-000000000502
12033 1726867180.03759: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000502
12033 1726867180.03763: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12033 1726867180.03830: no more pending results, returning what we have 12033 1726867180.03834:
results queue empty 12033 1726867180.03835: checking for any_errors_fatal 12033 1726867180.03843: done checking for any_errors_fatal 12033 1726867180.03844: checking for max_fail_percentage 12033 1726867180.03847: done checking for max_fail_percentage 12033 1726867180.03848: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.03848: done checking to see if all hosts have failed 12033 1726867180.03849: getting the remaining hosts for this loop 12033 1726867180.03851: done getting the remaining hosts for this loop 12033 1726867180.03855: getting the next task for host managed_node3 12033 1726867180.03864: done getting next task for host managed_node3 12033 1726867180.03866: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12033 1726867180.03873: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.03880: getting variables 12033 1726867180.03882: in VariableManager get_vars() 12033 1726867180.03915: Calling all_inventory to load vars for managed_node3 12033 1726867180.03918: Calling groups_inventory to load vars for managed_node3 12033 1726867180.03921: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.03934: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.03937: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.03940: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.05431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.07044: done with get_vars() 12033 1726867180.07068: done getting variables 12033 1726867180.07133: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.07248: variable 'profile' from source: include params 12033 1726867180.07252: variable 'bond_port_profile' from source: include params 12033 1726867180.07313: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:40 -0400 (0:00:00.053) 0:00:19.189 ****** 12033 1726867180.07354: entering _queue_task() for managed_node3/set_fact 12033 
1726867180.07782: worker is 1 (out of 1 available) 12033 1726867180.07793: exiting _queue_task() for managed_node3/set_fact 12033 1726867180.07803: done queuing things up, now waiting for results queue to drain 12033 1726867180.07804: waiting for pending results... 12033 1726867180.08015: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 12033 1726867180.08483: in run() - task 0affcac9-a3a5-74bb-502b-000000000503 12033 1726867180.08487: variable 'ansible_search_path' from source: unknown 12033 1726867180.08489: variable 'ansible_search_path' from source: unknown 12033 1726867180.08491: calling self._execute() 12033 1726867180.08607: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.08610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.08613: variable 'omit' from source: magic vars 12033 1726867180.09214: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.09232: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.09468: variable 'profile_stat' from source: set_fact 12033 1726867180.09593: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867180.09601: when evaluation is False, skipping this task 12033 1726867180.09608: _execute() done 12033 1726867180.09615: dumping result to json 12033 1726867180.09622: done dumping result, returning 12033 1726867180.09631: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-74bb-502b-000000000503] 12033 1726867180.09641: sending task result for task 0affcac9-a3a5-74bb-502b-000000000503 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867180.09781: no more pending results, returning what we have 12033 1726867180.09785: results queue empty 12033 
1726867180.09787: checking for any_errors_fatal 12033 1726867180.09793: done checking for any_errors_fatal 12033 1726867180.09794: checking for max_fail_percentage 12033 1726867180.09796: done checking for max_fail_percentage 12033 1726867180.09797: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.09798: done checking to see if all hosts have failed 12033 1726867180.09799: getting the remaining hosts for this loop 12033 1726867180.09800: done getting the remaining hosts for this loop 12033 1726867180.09804: getting the next task for host managed_node3 12033 1726867180.09813: done getting next task for host managed_node3 12033 1726867180.09815: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12033 1726867180.09822: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.09825: getting variables 12033 1726867180.09826: in VariableManager get_vars() 12033 1726867180.09856: Calling all_inventory to load vars for managed_node3 12033 1726867180.09859: Calling groups_inventory to load vars for managed_node3 12033 1726867180.09862: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.09875: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.09881: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.09884: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.11167: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000503 12033 1726867180.11170: WORKER PROCESS EXITING 12033 1726867180.12971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.16288: done with get_vars() 12033 1726867180.16313: done getting variables 12033 1726867180.16559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.16794: variable 'profile' from source: include params 12033 1726867180.16798: variable 'bond_port_profile' from source: include params 12033 1726867180.16860: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:40 
-0400 (0:00:00.096) 0:00:19.285 ****** 12033 1726867180.17012: entering _queue_task() for managed_node3/command 12033 1726867180.17875: worker is 1 (out of 1 available) 12033 1726867180.17888: exiting _queue_task() for managed_node3/command 12033 1726867180.17898: done queuing things up, now waiting for results queue to drain 12033 1726867180.17899: waiting for pending results... 12033 1726867180.18200: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 12033 1726867180.18994: in run() - task 0affcac9-a3a5-74bb-502b-000000000504 12033 1726867180.18999: variable 'ansible_search_path' from source: unknown 12033 1726867180.19002: variable 'ansible_search_path' from source: unknown 12033 1726867180.19005: calling self._execute() 12033 1726867180.19429: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.19432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.19435: variable 'omit' from source: magic vars 12033 1726867180.20139: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.20214: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.20449: variable 'profile_stat' from source: set_fact 12033 1726867180.20466: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867180.20531: when evaluation is False, skipping this task 12033 1726867180.20539: _execute() done 12033 1726867180.20546: dumping result to json 12033 1726867180.20553: done dumping result, returning 12033 1726867180.20563: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-74bb-502b-000000000504] 12033 1726867180.20572: sending task result for task 0affcac9-a3a5-74bb-502b-000000000504 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867180.20715: no more 
pending results, returning what we have 12033 1726867180.20720: results queue empty 12033 1726867180.20721: checking for any_errors_fatal 12033 1726867180.20727: done checking for any_errors_fatal 12033 1726867180.20728: checking for max_fail_percentage 12033 1726867180.20731: done checking for max_fail_percentage 12033 1726867180.20732: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.20732: done checking to see if all hosts have failed 12033 1726867180.20733: getting the remaining hosts for this loop 12033 1726867180.20735: done getting the remaining hosts for this loop 12033 1726867180.20738: getting the next task for host managed_node3 12033 1726867180.20747: done getting next task for host managed_node3 12033 1726867180.20749: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12033 1726867180.20757: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.20762: getting variables 12033 1726867180.20763: in VariableManager get_vars() 12033 1726867180.20796: Calling all_inventory to load vars for managed_node3 12033 1726867180.20798: Calling groups_inventory to load vars for managed_node3 12033 1726867180.20802: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.20815: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.20818: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.20821: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.21892: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000504 12033 1726867180.21896: WORKER PROCESS EXITING 12033 1726867180.22994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.26327: done with get_vars() 12033 1726867180.26351: done getting variables 12033 1726867180.26532: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.26714: variable 'profile' from source: include params 12033 1726867180.26718: variable 'bond_port_profile' from source: include params 12033 1726867180.26786: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:40 -0400 (0:00:00.098) 0:00:19.383 ****** 12033 1726867180.26820: entering _queue_task() for managed_node3/set_fact 12033 1726867180.27579: worker is 1 (out of 1 available) 12033 1726867180.27594: exiting _queue_task() for managed_node3/set_fact 12033 1726867180.27720: done queuing things up, now waiting for results queue to drain 12033 1726867180.27722: waiting for pending results... 12033 1726867180.28171: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 12033 1726867180.28432: in run() - task 0affcac9-a3a5-74bb-502b-000000000505 12033 1726867180.28489: variable 'ansible_search_path' from source: unknown 12033 1726867180.28493: variable 'ansible_search_path' from source: unknown 12033 1726867180.28566: calling self._execute() 12033 1726867180.29205: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.29209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.29211: variable 'omit' from source: magic vars 12033 1726867180.30290: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.30304: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.30482: variable 'profile_stat' from source: set_fact 12033 1726867180.30486: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867180.30489: when evaluation is False, skipping this task 12033 1726867180.30491: _execute() done 12033 1726867180.30494: dumping result to json 12033 1726867180.30496: done dumping result, returning 12033 1726867180.30499: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-74bb-502b-000000000505] 12033 1726867180.30501: sending task result for task 0affcac9-a3a5-74bb-502b-000000000505 12033 
1726867180.31006: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000505 12033 1726867180.31010: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867180.31053: no more pending results, returning what we have 12033 1726867180.31058: results queue empty 12033 1726867180.31059: checking for any_errors_fatal 12033 1726867180.31064: done checking for any_errors_fatal 12033 1726867180.31065: checking for max_fail_percentage 12033 1726867180.31066: done checking for max_fail_percentage 12033 1726867180.31067: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.31068: done checking to see if all hosts have failed 12033 1726867180.31069: getting the remaining hosts for this loop 12033 1726867180.31071: done getting the remaining hosts for this loop 12033 1726867180.31074: getting the next task for host managed_node3 12033 1726867180.31085: done getting next task for host managed_node3 12033 1726867180.31087: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12033 1726867180.31092: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.31096: getting variables 12033 1726867180.31097: in VariableManager get_vars() 12033 1726867180.31124: Calling all_inventory to load vars for managed_node3 12033 1726867180.31127: Calling groups_inventory to load vars for managed_node3 12033 1726867180.31129: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.31138: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.31141: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.31143: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.33664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.37787: done with get_vars() 12033 1726867180.37816: done getting variables 12033 1726867180.37876: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.38309: variable 'profile' from source: include params 12033 1726867180.38314: variable 'bond_port_profile' from source: include params 12033 1726867180.38375: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:40 -0400 (0:00:00.115) 0:00:19.499 
****** 12033 1726867180.38409: entering _queue_task() for managed_node3/assert 12033 1726867180.39509: worker is 1 (out of 1 available) 12033 1726867180.39519: exiting _queue_task() for managed_node3/assert 12033 1726867180.39529: done queuing things up, now waiting for results queue to drain 12033 1726867180.39530: waiting for pending results... 12033 1726867180.39674: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 12033 1726867180.40137: in run() - task 0affcac9-a3a5-74bb-502b-0000000004da 12033 1726867180.40142: variable 'ansible_search_path' from source: unknown 12033 1726867180.40145: variable 'ansible_search_path' from source: unknown 12033 1726867180.40147: calling self._execute() 12033 1726867180.40303: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.40464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.40468: variable 'omit' from source: magic vars 12033 1726867180.41440: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.41570: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.41592: variable 'omit' from source: magic vars 12033 1726867180.41636: variable 'omit' from source: magic vars 12033 1726867180.41856: variable 'profile' from source: include params 12033 1726867180.41861: variable 'bond_port_profile' from source: include params 12033 1726867180.42064: variable 'bond_port_profile' from source: include params 12033 1726867180.42084: variable 'omit' from source: magic vars 12033 1726867180.42167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867180.42385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867180.42390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867180.42393: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.42395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.42583: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867180.42586: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.42589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.42700: Set connection var ansible_pipelining to False 12033 1726867180.42711: Set connection var ansible_shell_executable to /bin/sh 12033 1726867180.42719: Set connection var ansible_timeout to 10 12033 1726867180.42731: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867180.42734: Set connection var ansible_connection to ssh 12033 1726867180.42736: Set connection var ansible_shell_type to sh 12033 1726867180.42874: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.42880: variable 'ansible_connection' from source: unknown 12033 1726867180.42882: variable 'ansible_module_compression' from source: unknown 12033 1726867180.42885: variable 'ansible_shell_type' from source: unknown 12033 1726867180.42887: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.42893: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.42898: variable 'ansible_pipelining' from source: unknown 12033 1726867180.42901: variable 'ansible_timeout' from source: unknown 12033 1726867180.42904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.43165: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867180.43183: variable 'omit' from source: magic vars 12033 1726867180.43186: starting attempt loop 12033 1726867180.43188: running the handler 12033 1726867180.43526: variable 'lsr_net_profile_exists' from source: set_fact 12033 1726867180.43529: Evaluated conditional (lsr_net_profile_exists): True 12033 1726867180.43537: handler run complete 12033 1726867180.43551: attempt loop complete, returning result 12033 1726867180.43554: _execute() done 12033 1726867180.43556: dumping result to json 12033 1726867180.43779: done dumping result, returning 12033 1726867180.43783: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0affcac9-a3a5-74bb-502b-0000000004da] 12033 1726867180.43785: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004da ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867180.43971: no more pending results, returning what we have 12033 1726867180.43975: results queue empty 12033 1726867180.43975: checking for any_errors_fatal 12033 1726867180.43986: done checking for any_errors_fatal 12033 1726867180.43987: checking for max_fail_percentage 12033 1726867180.43989: done checking for max_fail_percentage 12033 1726867180.43990: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.43991: done checking to see if all hosts have failed 12033 1726867180.43992: getting the remaining hosts for this loop 12033 1726867180.43993: done getting the remaining hosts for this loop 12033 1726867180.43996: getting the next task for host managed_node3 12033 1726867180.44004: done getting next task for host managed_node3 12033 1726867180.44006: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12033 1726867180.44011: ^ state 
is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867180.44014: getting variables 12033 1726867180.44015: in VariableManager get_vars() 12033 1726867180.44159: Calling all_inventory to load vars for managed_node3 12033 1726867180.44162: Calling groups_inventory to load vars for managed_node3 12033 1726867180.44166: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.44254: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.44259: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.44262: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.44884: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004da 12033 1726867180.44887: WORKER PROCESS EXITING 12033 1726867180.47310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.51049: done with get_vars() 12033 1726867180.51288: done getting variables 12033 1726867180.51351: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.51488: variable 'profile' from source: include params 12033 1726867180.51493: variable 'bond_port_profile' from source: include params 12033 1726867180.51553: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:40 -0400 (0:00:00.131) 0:00:19.631 ****** 12033 1726867180.51598: entering _queue_task() for managed_node3/assert 12033 1726867180.52132: worker is 1 (out of 1 available) 12033 1726867180.52144: exiting _queue_task() for 
managed_node3/assert 12033 1726867180.52155: done queuing things up, now waiting for results queue to drain 12033 1726867180.52157: waiting for pending results... 12033 1726867180.52303: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 12033 1726867180.52429: in run() - task 0affcac9-a3a5-74bb-502b-0000000004db 12033 1726867180.52454: variable 'ansible_search_path' from source: unknown 12033 1726867180.52463: variable 'ansible_search_path' from source: unknown 12033 1726867180.52514: calling self._execute() 12033 1726867180.52618: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.52629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.52640: variable 'omit' from source: magic vars 12033 1726867180.53022: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.53041: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.53050: variable 'omit' from source: magic vars 12033 1726867180.53113: variable 'omit' from source: magic vars 12033 1726867180.53236: variable 'profile' from source: include params 12033 1726867180.53280: variable 'bond_port_profile' from source: include params 12033 1726867180.53312: variable 'bond_port_profile' from source: include params 12033 1726867180.53379: variable 'omit' from source: magic vars 12033 1726867180.53392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867180.53435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867180.53507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867180.53511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.53513: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.53539: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867180.53548: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.53560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.53671: Set connection var ansible_pipelining to False 12033 1726867180.53688: Set connection var ansible_shell_executable to /bin/sh 12033 1726867180.53703: Set connection var ansible_timeout to 10 12033 1726867180.53724: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867180.53727: Set connection var ansible_connection to ssh 12033 1726867180.53730: Set connection var ansible_shell_type to sh 12033 1726867180.53780: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.53783: variable 'ansible_connection' from source: unknown 12033 1726867180.53785: variable 'ansible_module_compression' from source: unknown 12033 1726867180.53787: variable 'ansible_shell_type' from source: unknown 12033 1726867180.53789: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.53791: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.53793: variable 'ansible_pipelining' from source: unknown 12033 1726867180.53795: variable 'ansible_timeout' from source: unknown 12033 1726867180.53834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.53957: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867180.53974: variable 'omit' from source: magic vars 12033 
1726867180.53991: starting attempt loop 12033 1726867180.54000: running the handler 12033 1726867180.54160: variable 'lsr_net_profile_ansible_managed' from source: set_fact 12033 1726867180.54163: Evaluated conditional (lsr_net_profile_ansible_managed): True 12033 1726867180.54165: handler run complete 12033 1726867180.54167: attempt loop complete, returning result 12033 1726867180.54169: _execute() done 12033 1726867180.54171: dumping result to json 12033 1726867180.54173: done dumping result, returning 12033 1726867180.54175: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcac9-a3a5-74bb-502b-0000000004db] 12033 1726867180.54179: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004db ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867180.54328: no more pending results, returning what we have 12033 1726867180.54332: results queue empty 12033 1726867180.54334: checking for any_errors_fatal 12033 1726867180.54341: done checking for any_errors_fatal 12033 1726867180.54342: checking for max_fail_percentage 12033 1726867180.54345: done checking for max_fail_percentage 12033 1726867180.54346: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.54347: done checking to see if all hosts have failed 12033 1726867180.54347: getting the remaining hosts for this loop 12033 1726867180.54349: done getting the remaining hosts for this loop 12033 1726867180.54352: getting the next task for host managed_node3 12033 1726867180.54360: done getting next task for host managed_node3 12033 1726867180.54363: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12033 1726867180.54369: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.54372: getting variables 12033 1726867180.54374: in VariableManager get_vars() 12033 1726867180.54410: Calling all_inventory to load vars for managed_node3 12033 1726867180.54413: Calling groups_inventory to load vars for managed_node3 12033 1726867180.54417: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.54429: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.54432: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.54436: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.55206: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004db 12033 1726867180.55210: WORKER PROCESS EXITING 12033 1726867180.56131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.57694: done with get_vars() 12033 1726867180.57718: done getting variables 12033 1726867180.57781: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867180.57896: variable 'profile' from source: include params 12033 1726867180.57900: variable 'bond_port_profile' from source: include params 12033 1726867180.57966: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:40 -0400 (0:00:00.064) 0:00:19.695 ****** 12033 1726867180.58001: entering _queue_task() for managed_node3/assert 12033 1726867180.58343: worker is 1 (out of 1 available) 12033 1726867180.58354: exiting _queue_task() for managed_node3/assert 12033 1726867180.58368: done queuing things up, now waiting for results queue to drain 12033 1726867180.58370: waiting for pending results... 
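The two assert tasks traced in this section come from assert_profile_present.yml (the task path is shown in the log). The task names and the evaluated conditionals (lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) are taken directly from the log records above and below; the exact task bodies are not shown in the log, so the following is a hypothetical sketch of what they likely look like:

```yaml
# Sketch reconstructed from the log, not the actual file contents.
# Task names and the asserted variables appear verbatim in the trace;
# everything else (fail_msg, layout) is an assumption.
- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Both conditionals evaluate to True in this run, which is why each task returns `ok` with "All assertions passed" and `"changed": false` (assert is a pure check and never reports a change).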
12033 1726867180.58660: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 12033 1726867180.58807: in run() - task 0affcac9-a3a5-74bb-502b-0000000004dc 12033 1726867180.58828: variable 'ansible_search_path' from source: unknown 12033 1726867180.58837: variable 'ansible_search_path' from source: unknown 12033 1726867180.58875: calling self._execute() 12033 1726867180.58974: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.58988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.59003: variable 'omit' from source: magic vars 12033 1726867180.59371: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.59391: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.59403: variable 'omit' from source: magic vars 12033 1726867180.59463: variable 'omit' from source: magic vars 12033 1726867180.59571: variable 'profile' from source: include params 12033 1726867180.59582: variable 'bond_port_profile' from source: include params 12033 1726867180.59649: variable 'bond_port_profile' from source: include params 12033 1726867180.59682: variable 'omit' from source: magic vars 12033 1726867180.59724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867180.59763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867180.59795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867180.59892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.59895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.59898: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867180.59900: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.59902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.59980: Set connection var ansible_pipelining to False 12033 1726867180.59994: Set connection var ansible_shell_executable to /bin/sh 12033 1726867180.60012: Set connection var ansible_timeout to 10 12033 1726867180.60021: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867180.60029: Set connection var ansible_connection to ssh 12033 1726867180.60040: Set connection var ansible_shell_type to sh 12033 1726867180.60064: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.60072: variable 'ansible_connection' from source: unknown 12033 1726867180.60081: variable 'ansible_module_compression' from source: unknown 12033 1726867180.60087: variable 'ansible_shell_type' from source: unknown 12033 1726867180.60092: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.60097: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.60103: variable 'ansible_pipelining' from source: unknown 12033 1726867180.60112: variable 'ansible_timeout' from source: unknown 12033 1726867180.60120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.60329: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867180.60333: variable 'omit' from source: magic vars 12033 1726867180.60335: starting attempt loop 12033 1726867180.60337: running the handler 12033 1726867180.60405: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12033 1726867180.60413: Evaluated conditional (lsr_net_profile_fingerprint): True 12033 1726867180.60421: handler run complete 12033 1726867180.60440: attempt loop complete, returning result 12033 1726867180.60446: _execute() done 12033 1726867180.60450: dumping result to json 12033 1726867180.60456: done dumping result, returning 12033 1726867180.60464: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0affcac9-a3a5-74bb-502b-0000000004dc] 12033 1726867180.60471: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004dc 12033 1726867180.60826: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004dc 12033 1726867180.60829: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867180.60913: no more pending results, returning what we have 12033 1726867180.60918: results queue empty 12033 1726867180.60919: checking for any_errors_fatal 12033 1726867180.60928: done checking for any_errors_fatal 12033 1726867180.60928: checking for max_fail_percentage 12033 1726867180.60931: done checking for max_fail_percentage 12033 1726867180.60932: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.60932: done checking to see if all hosts have failed 12033 1726867180.60933: getting the remaining hosts for this loop 12033 1726867180.60935: done getting the remaining hosts for this loop 12033 1726867180.60938: getting the next task for host managed_node3 12033 1726867180.60949: done getting next task for host managed_node3 12033 1726867180.60952: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12033 1726867180.60957: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.60961: getting variables 12033 1726867180.60963: in VariableManager get_vars() 12033 1726867180.60995: Calling all_inventory to load vars for managed_node3 12033 1726867180.60998: Calling groups_inventory to load vars for managed_node3 12033 1726867180.61002: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.61013: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.61016: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.61019: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.70762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.72661: done with get_vars() 12033 1726867180.72887: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 
September 2024 17:19:40 -0400 (0:00:00.149) 0:00:19.845 ****** 12033 1726867180.72968: entering _queue_task() for managed_node3/include_tasks 12033 1726867180.73707: worker is 1 (out of 1 available) 12033 1726867180.73718: exiting _queue_task() for managed_node3/include_tasks 12033 1726867180.73729: done queuing things up, now waiting for results queue to drain 12033 1726867180.73731: waiting for pending results... 12033 1726867180.74080: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12033 1726867180.74385: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e0 12033 1726867180.74391: variable 'ansible_search_path' from source: unknown 12033 1726867180.74394: variable 'ansible_search_path' from source: unknown 12033 1726867180.74494: calling self._execute() 12033 1726867180.74620: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.74624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.74699: variable 'omit' from source: magic vars 12033 1726867180.75331: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.75342: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.75348: _execute() done 12033 1726867180.75351: dumping result to json 12033 1726867180.75360: done dumping result, returning 12033 1726867180.75367: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-74bb-502b-0000000004e0] 12033 1726867180.75372: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e0 12033 1726867180.75718: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e0 12033 1726867180.75720: WORKER PROCESS EXITING 12033 1726867180.75742: no more pending results, returning what we have 12033 1726867180.75746: in VariableManager get_vars() 12033 1726867180.75774: Calling all_inventory to load vars for managed_node3 12033 
1726867180.75776: Calling groups_inventory to load vars for managed_node3 12033 1726867180.75782: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.75790: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.75793: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.75796: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.77943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.80794: done with get_vars() 12033 1726867180.80818: variable 'ansible_search_path' from source: unknown 12033 1726867180.80820: variable 'ansible_search_path' from source: unknown 12033 1726867180.80856: we have included files to process 12033 1726867180.80857: generating all_blocks data 12033 1726867180.80859: done generating all_blocks data 12033 1726867180.80863: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867180.80864: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867180.80866: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867180.82939: done processing included file 12033 1726867180.82941: iterating over new_blocks loaded from include file 12033 1726867180.82943: in VariableManager get_vars() 12033 1726867180.82960: done with get_vars() 12033 1726867180.82962: filtering new block on tags 12033 1726867180.83262: done filtering new block on tags 12033 1726867180.83265: in VariableManager get_vars() 12033 1726867180.83283: done with get_vars() 12033 1726867180.83284: filtering new block on tags 12033 1726867180.83554: done filtering new block on tags 12033 1726867180.83557: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12033 1726867180.83563: extending task lists for all hosts with included blocks 12033 1726867180.84756: done extending task lists 12033 1726867180.84757: done processing included files 12033 1726867180.84758: results queue empty 12033 1726867180.84759: checking for any_errors_fatal 12033 1726867180.84763: done checking for any_errors_fatal 12033 1726867180.84764: checking for max_fail_percentage 12033 1726867180.84765: done checking for max_fail_percentage 12033 1726867180.84766: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.84767: done checking to see if all hosts have failed 12033 1726867180.84768: getting the remaining hosts for this loop 12033 1726867180.84769: done getting the remaining hosts for this loop 12033 1726867180.84771: getting the next task for host managed_node3 12033 1726867180.84776: done getting next task for host managed_node3 12033 1726867180.84781: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867180.84784: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867180.84787: getting variables 12033 1726867180.84788: in VariableManager get_vars() 12033 1726867180.84799: Calling all_inventory to load vars for managed_node3 12033 1726867180.84802: Calling groups_inventory to load vars for managed_node3 12033 1726867180.84804: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.84810: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.84812: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.84816: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.87065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.89158: done with get_vars() 12033 1726867180.89179: done getting variables 12033 1726867180.89219: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:40 -0400 (0:00:00.162) 0:00:20.008 ****** 12033 1726867180.89386: entering _queue_task() for managed_node3/set_fact 12033 1726867180.90056: worker is 1 (out of 1 available) 12033 1726867180.90068: exiting _queue_task() for managed_node3/set_fact 12033 1726867180.90082: done queuing things up, now waiting for results queue to drain 12033 1726867180.90084: waiting for pending results... 12033 1726867180.90552: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867180.90656: in run() - task 0affcac9-a3a5-74bb-502b-000000000558 12033 1726867180.90670: variable 'ansible_search_path' from source: unknown 12033 1726867180.90674: variable 'ansible_search_path' from source: unknown 12033 1726867180.90914: calling self._execute() 12033 1726867180.91002: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.91008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.91086: variable 'omit' from source: magic vars 12033 1726867180.91790: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.91806: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.91812: variable 'omit' from source: magic vars 12033 1726867180.91867: variable 'omit' from source: magic vars 12033 1726867180.92106: variable 'omit' from source: magic vars 12033 1726867180.92140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867180.92173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867180.92195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867180.92212: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.92223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.92249: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867180.92252: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.92256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.92559: Set connection var ansible_pipelining to False 12033 1726867180.92583: Set connection var ansible_shell_executable to /bin/sh 12033 1726867180.92586: Set connection var ansible_timeout to 10 12033 1726867180.92589: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867180.92592: Set connection var ansible_connection to ssh 12033 1726867180.92594: Set connection var ansible_shell_type to sh 12033 1726867180.92683: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.92686: variable 'ansible_connection' from source: unknown 12033 1726867180.92689: variable 'ansible_module_compression' from source: unknown 12033 1726867180.92691: variable 'ansible_shell_type' from source: unknown 12033 1726867180.92693: variable 'ansible_shell_executable' from source: unknown 12033 1726867180.92695: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.92697: variable 'ansible_pipelining' from source: unknown 12033 1726867180.92699: variable 'ansible_timeout' from source: unknown 12033 1726867180.92701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.92972: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867180.92984: variable 'omit' from source: magic vars 12033 1726867180.92989: starting attempt loop 12033 1726867180.92995: running the handler 12033 1726867180.93008: handler run complete 12033 1726867180.93037: attempt loop complete, returning result 12033 1726867180.93040: _execute() done 12033 1726867180.93042: dumping result to json 12033 1726867180.93044: done dumping result, returning 12033 1726867180.93047: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-74bb-502b-000000000558] 12033 1726867180.93049: sending task result for task 0affcac9-a3a5-74bb-502b-000000000558 12033 1726867180.93124: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000558 12033 1726867180.93127: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12033 1726867180.93192: no more pending results, returning what we have 12033 1726867180.93196: results queue empty 12033 1726867180.93197: checking for any_errors_fatal 12033 1726867180.93199: done checking for any_errors_fatal 12033 1726867180.93199: checking for max_fail_percentage 12033 1726867180.93201: done checking for max_fail_percentage 12033 1726867180.93202: checking to see if all hosts have failed and the running result is not ok 12033 1726867180.93203: done checking to see if all hosts have failed 12033 1726867180.93203: getting the remaining hosts for this loop 12033 1726867180.93205: done getting the remaining hosts for this loop 12033 1726867180.93208: getting the next task for host managed_node3 12033 1726867180.93218: done getting next task for host managed_node3 12033 1726867180.93220: ^ task is: 
TASK: Stat profile file 12033 1726867180.93227: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867180.93231: getting variables 12033 1726867180.93232: in VariableManager get_vars() 12033 1726867180.93263: Calling all_inventory to load vars for managed_node3 12033 1726867180.93265: Calling groups_inventory to load vars for managed_node3 12033 1726867180.93268: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867180.93281: Calling all_plugins_play to load vars for managed_node3 12033 1726867180.93284: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867180.93287: Calling groups_plugins_play to load vars for managed_node3 12033 1726867180.96313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867180.97922: done with get_vars() 12033 1726867180.97943: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:40 -0400 (0:00:00.087) 0:00:20.096 ****** 12033 1726867180.98048: entering _queue_task() for managed_node3/stat 12033 1726867180.98468: worker is 1 (out of 1 available) 12033 1726867180.98481: exiting _queue_task() for managed_node3/stat 12033 1726867180.98491: done queuing things up, now waiting for results queue to drain 12033 1726867180.98493: waiting for pending results... 
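The set_fact result shown above (get_profile_stat.yml:3) initializes three flags to false before the profile is inspected; later tasks in get_profile_stat.yml then flip them based on what the stat and file-content checks find. The fact names and values are taken verbatim from the log's `ansible_facts` output; the task layout itself is an assumed sketch:

```yaml
# Hypothetical sketch of the initializing task in get_profile_stat.yml.
# The three fact names and their false defaults are exactly what the
# log's result block reports; only the YAML shape is assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Initializing the flags up front ensures the assert tasks earlier in this log fail cleanly (undefined-variable errors are avoided) whenever a profile check is skipped or finds nothing.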
12033 1726867180.98695: running TaskExecutor() for managed_node3/TASK: Stat profile file 12033 1726867180.98901: in run() - task 0affcac9-a3a5-74bb-502b-000000000559 12033 1726867180.98905: variable 'ansible_search_path' from source: unknown 12033 1726867180.98908: variable 'ansible_search_path' from source: unknown 12033 1726867180.98912: calling self._execute() 12033 1726867180.98980: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.98994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867180.99014: variable 'omit' from source: magic vars 12033 1726867180.99408: variable 'ansible_distribution_major_version' from source: facts 12033 1726867180.99428: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867180.99445: variable 'omit' from source: magic vars 12033 1726867180.99517: variable 'omit' from source: magic vars 12033 1726867180.99621: variable 'profile' from source: include params 12033 1726867180.99631: variable 'bond_port_profile' from source: include params 12033 1726867180.99704: variable 'bond_port_profile' from source: include params 12033 1726867180.99770: variable 'omit' from source: magic vars 12033 1726867180.99776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867180.99827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867180.99851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867180.99879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.99897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867180.99985: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867180.99989: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867180.99991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.00053: Set connection var ansible_pipelining to False 12033 1726867181.00069: Set connection var ansible_shell_executable to /bin/sh 12033 1726867181.00090: Set connection var ansible_timeout to 10 12033 1726867181.00103: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867181.00111: Set connection var ansible_connection to ssh 12033 1726867181.00123: Set connection var ansible_shell_type to sh 12033 1726867181.00147: variable 'ansible_shell_executable' from source: unknown 12033 1726867181.00168: variable 'ansible_connection' from source: unknown 12033 1726867181.00171: variable 'ansible_module_compression' from source: unknown 12033 1726867181.00173: variable 'ansible_shell_type' from source: unknown 12033 1726867181.00176: variable 'ansible_shell_executable' from source: unknown 12033 1726867181.00202: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.00205: variable 'ansible_pipelining' from source: unknown 12033 1726867181.00207: variable 'ansible_timeout' from source: unknown 12033 1726867181.00210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.00419: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867181.00482: variable 'omit' from source: magic vars 12033 1726867181.00485: starting attempt loop 12033 1726867181.00490: running the handler 12033 1726867181.00492: _low_level_execute_command(): starting 12033 1726867181.00494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867181.01271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.01563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.03289: stdout chunk (state=3): >>>/root <<< 12033 1726867181.03439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.03442: stdout chunk (state=3): >>><<< 12033 1726867181.03445: stderr chunk (state=3): >>><<< 12033 1726867181.03495: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.03499: _low_level_execute_command(): starting 12033 1726867181.03502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858 `" && echo ansible-tmp-1726867181.034657-12968-242261073398858="` echo /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858 `" ) && sleep 0' 12033 1726867181.04865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.04876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.04889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.04948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.04953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867181.04982: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867181.04992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867181.04995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867181.05012: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867181.05057: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867181.05060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.05065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.05067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.05605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.05608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.05610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.07443: stdout chunk (state=3): >>>ansible-tmp-1726867181.034657-12968-242261073398858=/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858 <<< 12033 1726867181.07580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.07584: stdout chunk (state=3): >>><<< 12033 1726867181.07592: stderr chunk (state=3): >>><<< 12033 1726867181.07607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867181.034657-12968-242261073398858=/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.07654: variable 'ansible_module_compression' from source: unknown 12033 1726867181.07710: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867181.07747: variable 'ansible_facts' from source: unknown 12033 1726867181.07966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py 12033 1726867181.08327: Sending initial data 12033 1726867181.08330: Sent initial data (152 bytes) 12033 1726867181.09695: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.09730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.09798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.09975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.10045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.11601: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867181.11628: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867181.11672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867181.11721: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpv_wox03k /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py <<< 12033 1726867181.11806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py" <<< 12033 1726867181.11827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpv_wox03k" to remote "/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py" <<< 12033 1726867181.13063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.13200: stderr chunk (state=3): >>><<< 12033 1726867181.13203: stdout chunk (state=3): >>><<< 12033 1726867181.13206: done transferring module to remote 12033 1726867181.13208: _low_level_execute_command(): starting 12033 1726867181.13210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/ /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py && sleep 0' 12033 1726867181.14507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.14524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.14571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.14705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.14761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.16518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.16634: stderr chunk (state=3): >>><<< 12033 1726867181.16638: stdout chunk (state=3): >>><<< 12033 1726867181.16640: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.16643: _low_level_execute_command(): starting 12033 1726867181.16645: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/AnsiballZ_stat.py && sleep 0' 12033 1726867181.17726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.17895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.17995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.18062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.18403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.18483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.33675: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867181.35108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867181.35112: stdout chunk (state=3): >>><<< 12033 1726867181.35114: stderr chunk (state=3): >>><<< 12033 1726867181.35139: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867181.35252: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867181.35256: _low_level_execute_command(): starting 12033 1726867181.35259: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867181.034657-12968-242261073398858/ > /dev/null 2>&1 && sleep 0' 12033 1726867181.35911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.35927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.35941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.35960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.35978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867181.35994: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867181.36094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.36121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.36135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.36222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.38199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.38203: stdout chunk (state=3): >>><<< 12033 1726867181.38205: stderr chunk (state=3): >>><<< 12033 1726867181.38219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.38244: handler run complete 12033 1726867181.38271: attempt loop complete, returning result 12033 1726867181.38287: _execute() done 12033 1726867181.38299: dumping result to json 12033 1726867181.38313: done dumping result, returning 12033 1726867181.38327: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-74bb-502b-000000000559] 12033 1726867181.38337: sending task result for task 0affcac9-a3a5-74bb-502b-000000000559 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12033 1726867181.38537: no more pending results, returning what we have 12033 1726867181.38540: results queue empty 12033 1726867181.38541: checking for any_errors_fatal 12033 1726867181.38548: done checking for any_errors_fatal 12033 1726867181.38548: checking for max_fail_percentage 12033 1726867181.38550: done checking for max_fail_percentage 12033 1726867181.38551: checking to see if all hosts have failed and the running result is not ok 12033 1726867181.38552: done checking to see if all hosts have failed 12033 1726867181.38552: getting the remaining hosts for this loop 12033 1726867181.38554: done getting the remaining hosts for this loop 12033 1726867181.38557: getting the next task for host managed_node3 12033 1726867181.38564: done getting next task for host managed_node3 12033 1726867181.38568: ^ task is: TASK: Set NM profile exist flag based on the profile files 12033 1726867181.38575: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867181.38581: getting variables 12033 1726867181.38582: in VariableManager get_vars() 12033 1726867181.38613: Calling all_inventory to load vars for managed_node3 12033 1726867181.38615: Calling groups_inventory to load vars for managed_node3 12033 1726867181.38618: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867181.38630: Calling all_plugins_play to load vars for managed_node3 12033 1726867181.38633: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867181.38636: Calling groups_plugins_play to load vars for managed_node3 12033 1726867181.39158: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000559 12033 1726867181.39162: WORKER PROCESS EXITING 12033 1726867181.40112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867181.42496: done with get_vars() 12033 1726867181.42518: done getting variables 12033 1726867181.42637: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:41 -0400 (0:00:00.446) 0:00:20.542 ****** 12033 1726867181.42675: entering _queue_task() for managed_node3/set_fact 12033 1726867181.43039: worker is 1 (out of 1 available) 12033 1726867181.43050: exiting _queue_task() for managed_node3/set_fact 12033 1726867181.43063: done queuing things up, now waiting for results queue to drain 12033 1726867181.43064: waiting for pending results... 
12033 1726867181.43348: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12033 1726867181.43505: in run() - task 0affcac9-a3a5-74bb-502b-00000000055a 12033 1726867181.43513: variable 'ansible_search_path' from source: unknown 12033 1726867181.43522: variable 'ansible_search_path' from source: unknown 12033 1726867181.43685: calling self._execute() 12033 1726867181.43692: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.43696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.43699: variable 'omit' from source: magic vars 12033 1726867181.44092: variable 'ansible_distribution_major_version' from source: facts 12033 1726867181.44111: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867181.44288: variable 'profile_stat' from source: set_fact 12033 1726867181.44308: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867181.44315: when evaluation is False, skipping this task 12033 1726867181.44322: _execute() done 12033 1726867181.44329: dumping result to json 12033 1726867181.44336: done dumping result, returning 12033 1726867181.44345: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-74bb-502b-00000000055a] 12033 1726867181.44584: sending task result for task 0affcac9-a3a5-74bb-502b-00000000055a 12033 1726867181.44656: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000055a 12033 1726867181.44659: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867181.44716: no more pending results, returning what we have 12033 1726867181.44721: results queue empty 12033 1726867181.44723: checking for any_errors_fatal 12033 1726867181.44730: done checking for any_errors_fatal 12033 1726867181.44731: 
checking for max_fail_percentage 12033 1726867181.44733: done checking for max_fail_percentage 12033 1726867181.44734: checking to see if all hosts have failed and the running result is not ok 12033 1726867181.44735: done checking to see if all hosts have failed 12033 1726867181.44736: getting the remaining hosts for this loop 12033 1726867181.44738: done getting the remaining hosts for this loop 12033 1726867181.44741: getting the next task for host managed_node3 12033 1726867181.44750: done getting next task for host managed_node3 12033 1726867181.44752: ^ task is: TASK: Get NM profile info 12033 1726867181.44759: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12033 1726867181.44764: getting variables 12033 1726867181.44765: in VariableManager get_vars() 12033 1726867181.44803: Calling all_inventory to load vars for managed_node3 12033 1726867181.44806: Calling groups_inventory to load vars for managed_node3 12033 1726867181.44809: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867181.44823: Calling all_plugins_play to load vars for managed_node3 12033 1726867181.44827: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867181.44831: Calling groups_plugins_play to load vars for managed_node3 12033 1726867181.47653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867181.51531: done with get_vars() 12033 1726867181.51557: done getting variables 12033 1726867181.51622: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:41 -0400 (0:00:00.089) 0:00:20.632 ****** 12033 1726867181.51660: entering _queue_task() for managed_node3/shell 12033 1726867181.52114: worker is 1 (out of 1 available) 12033 1726867181.52124: exiting _queue_task() for managed_node3/shell 12033 1726867181.52134: done queuing things up, now waiting for results queue to drain 12033 1726867181.52136: waiting for pending results... 
12033 1726867181.52599: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12033 1726867181.52609: in run() - task 0affcac9-a3a5-74bb-502b-00000000055b 12033 1726867181.52615: variable 'ansible_search_path' from source: unknown 12033 1726867181.52618: variable 'ansible_search_path' from source: unknown 12033 1726867181.52642: calling self._execute() 12033 1726867181.52737: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.52745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.52754: variable 'omit' from source: magic vars 12033 1726867181.53705: variable 'ansible_distribution_major_version' from source: facts 12033 1726867181.53826: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867181.53830: variable 'omit' from source: magic vars 12033 1726867181.53922: variable 'omit' from source: magic vars 12033 1726867181.54140: variable 'profile' from source: include params 12033 1726867181.54143: variable 'bond_port_profile' from source: include params 12033 1726867181.54266: variable 'bond_port_profile' from source: include params 12033 1726867181.54289: variable 'omit' from source: magic vars 12033 1726867181.54472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867181.54544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867181.54548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867181.54583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867181.54629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867181.54685: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867181.54688: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.54691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.54872: Set connection var ansible_pipelining to False 12033 1726867181.54901: Set connection var ansible_shell_executable to /bin/sh 12033 1726867181.54905: Set connection var ansible_timeout to 10 12033 1726867181.54907: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867181.54909: Set connection var ansible_connection to ssh 12033 1726867181.54918: Set connection var ansible_shell_type to sh 12033 1726867181.54983: variable 'ansible_shell_executable' from source: unknown 12033 1726867181.54987: variable 'ansible_connection' from source: unknown 12033 1726867181.54989: variable 'ansible_module_compression' from source: unknown 12033 1726867181.54991: variable 'ansible_shell_type' from source: unknown 12033 1726867181.54993: variable 'ansible_shell_executable' from source: unknown 12033 1726867181.54995: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.54997: variable 'ansible_pipelining' from source: unknown 12033 1726867181.54999: variable 'ansible_timeout' from source: unknown 12033 1726867181.55001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.55098: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867181.55109: variable 'omit' from source: magic vars 12033 1726867181.55117: starting attempt loop 12033 1726867181.55120: running the handler 12033 1726867181.55126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867181.55147: _low_level_execute_command(): starting 12033 1726867181.55184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867181.55841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.55857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.55868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.55893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.55985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867181.55988: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867181.55990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.55992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867181.56003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.56027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.56043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.56061: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.56138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.57823: stdout chunk (state=3): >>>/root <<< 12033 1726867181.57954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.57971: stdout chunk (state=3): >>><<< 12033 1726867181.57987: stderr chunk (state=3): >>><<< 12033 1726867181.58018: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.58106: _low_level_execute_command(): starting 12033 1726867181.58110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290 `" 
&& echo ansible-tmp-1726867181.5802412-12999-31991281431290="` echo /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290 `" ) && sleep 0' 12033 1726867181.58675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.58772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.58792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.58815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.58902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.60799: stdout chunk (state=3): >>>ansible-tmp-1726867181.5802412-12999-31991281431290=/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290 <<< 12033 1726867181.60946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.60965: stderr chunk (state=3): >>><<< 12033 1726867181.60974: stdout chunk (state=3): >>><<< 12033 1726867181.61084: _low_level_execute_command() done: 
rc=0, stdout=ansible-tmp-1726867181.5802412-12999-31991281431290=/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.61087: variable 'ansible_module_compression' from source: unknown 12033 1726867181.61089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867181.61156: variable 'ansible_facts' from source: unknown 12033 1726867181.61248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py 12033 1726867181.61707: Sending initial data 12033 1726867181.61710: Sent initial data (155 bytes) 12033 1726867181.62193: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 12033 1726867181.62196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867181.62199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.62202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.62204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.62249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.62262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.62310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.63868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867181.63935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867181.63986: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmplgl2_g7g /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py <<< 12033 1726867181.63996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py" <<< 12033 1726867181.64027: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmplgl2_g7g" to remote "/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py" <<< 12033 1726867181.64735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.64774: stderr chunk (state=3): >>><<< 12033 1726867181.64783: stdout chunk (state=3): >>><<< 12033 1726867181.64932: done transferring module to remote 12033 1726867181.64935: _low_level_execute_command(): starting 12033 1726867181.64938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/ /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py && sleep 0' 12033 1726867181.65501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.65516: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.65532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867181.65545: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867181.65610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.65672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.65716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.65830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.67546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.67607: stderr chunk (state=3): >>><<< 12033 1726867181.67617: stdout chunk (state=3): >>><<< 12033 1726867181.67638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.67720: _low_level_execute_command(): starting 12033 1726867181.67724: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/AnsiballZ_command.py && sleep 0' 12033 1726867181.68256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.68268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867181.68299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.68322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867181.68332: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867181.68342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.68357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867181.68433: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.68456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.68472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.68494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.68564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.86881: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:19:41.841925", "end": "2024-09-20 17:19:41.863064", "delta": "0:00:00.021139", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867181.88254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867181.88258: stdout chunk (state=3): >>><<< 12033 1726867181.88264: stderr chunk (state=3): >>><<< 12033 1726867181.88284: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:19:41.841925", "end": "2024-09-20 17:19:41.863064", "delta": "0:00:00.021139", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 12033 1726867181.88322: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867181.88329: _low_level_execute_command(): starting 12033 1726867181.88334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867181.5802412-12999-31991281431290/ > /dev/null 2>&1 && sleep 0' 12033 1726867181.88975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867181.88988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867181.89016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867181.89176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867181.89182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867181.89185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867181.89187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867181.89233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867181.91466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867181.91469: stderr chunk (state=3): >>><<< 12033 1726867181.91471: stdout chunk (state=3): >>><<< 12033 1726867181.91473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867181.91475: handler run complete 12033 1726867181.91476: Evaluated conditional (False): False 12033 1726867181.91483: attempt loop complete, returning result 12033 1726867181.91781: _execute() done 12033 1726867181.91785: dumping result to json 12033 1726867181.91788: done dumping result, returning 12033 1726867181.91790: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-74bb-502b-00000000055b] 12033 1726867181.91792: sending task result for task 0affcac9-a3a5-74bb-502b-00000000055b 12033 1726867181.91861: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000055b 12033 1726867181.91864: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.021139", "end": "2024-09-20 17:19:41.863064", "rc": 0, "start": "2024-09-20 17:19:41.841925" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 12033 1726867181.91937: no more pending results, returning what we have 12033 1726867181.91941: results queue empty 12033 1726867181.91942: checking for any_errors_fatal 12033 1726867181.91950: done checking for any_errors_fatal 12033 1726867181.91951: checking for max_fail_percentage 12033 1726867181.91953: done checking for max_fail_percentage 12033 1726867181.91954: checking to see if all hosts have failed and the running result is not ok 12033 1726867181.91955: done checking to see if all hosts have failed 12033 1726867181.91956: getting the remaining hosts for this loop 12033 1726867181.91958: done getting the remaining hosts for this loop 12033 1726867181.91961: getting the next task for host managed_node3 12033 1726867181.91970: done getting next task for host managed_node3 12033 1726867181.91972: ^ task is: TASK: Set NM 
profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867181.91980: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867181.91983: getting variables 12033 1726867181.91985: in VariableManager get_vars() 12033 1726867181.92017: Calling all_inventory to load vars for managed_node3 12033 1726867181.92019: Calling groups_inventory to load vars for managed_node3 12033 1726867181.92023: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867181.92036: Calling all_plugins_play to load vars for managed_node3 12033 1726867181.92039: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867181.92043: Calling groups_plugins_play to load vars for managed_node3 12033 1726867181.94341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867181.97751: done with get_vars() 12033 1726867181.97782: done getting variables 12033 1726867181.98218: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:41 -0400 (0:00:00.465) 0:00:21.098 ****** 12033 1726867181.98259: entering _queue_task() for managed_node3/set_fact 12033 1726867181.99018: worker is 1 (out of 1 available) 12033 1726867181.99029: exiting _queue_task() for managed_node3/set_fact 12033 1726867181.99039: done queuing things up, now waiting for results queue to drain 12033 1726867181.99041: waiting for pending results... 
12033 1726867181.99281: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867181.99635: in run() - task 0affcac9-a3a5-74bb-502b-00000000055c 12033 1726867181.99640: variable 'ansible_search_path' from source: unknown 12033 1726867181.99642: variable 'ansible_search_path' from source: unknown 12033 1726867181.99649: calling self._execute() 12033 1726867181.99966: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867181.99970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867181.99981: variable 'omit' from source: magic vars 12033 1726867182.00789: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.00804: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.01041: variable 'nm_profile_exists' from source: set_fact 12033 1726867182.01056: Evaluated conditional (nm_profile_exists.rc == 0): True 12033 1726867182.01061: variable 'omit' from source: magic vars 12033 1726867182.01152: variable 'omit' from source: magic vars 12033 1726867182.01185: variable 'omit' from source: magic vars 12033 1726867182.01343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.01381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.01457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.01478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.01556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.01587: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12033 1726867182.01591: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.01596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.01813: Set connection var ansible_pipelining to False 12033 1726867182.01822: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.01828: Set connection var ansible_timeout to 10 12033 1726867182.01833: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.01836: Set connection var ansible_connection to ssh 12033 1726867182.01841: Set connection var ansible_shell_type to sh 12033 1726867182.01860: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.01863: variable 'ansible_connection' from source: unknown 12033 1726867182.01865: variable 'ansible_module_compression' from source: unknown 12033 1726867182.01984: variable 'ansible_shell_type' from source: unknown 12033 1726867182.01988: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.01990: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.01998: variable 'ansible_pipelining' from source: unknown 12033 1726867182.02001: variable 'ansible_timeout' from source: unknown 12033 1726867182.02003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.02243: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867182.02258: variable 'omit' from source: magic vars 12033 1726867182.02261: starting attempt loop 12033 1726867182.02264: running the handler 12033 1726867182.02272: handler run complete 12033 1726867182.02284: attempt loop complete, returning result 12033 1726867182.02287: _execute() done 
12033 1726867182.02289: dumping result to json 12033 1726867182.02291: done dumping result, returning 12033 1726867182.02446: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-74bb-502b-00000000055c] 12033 1726867182.02449: sending task result for task 0affcac9-a3a5-74bb-502b-00000000055c 12033 1726867182.02697: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000055c 12033 1726867182.02700: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12033 1726867182.02758: no more pending results, returning what we have 12033 1726867182.02761: results queue empty 12033 1726867182.02762: checking for any_errors_fatal 12033 1726867182.02769: done checking for any_errors_fatal 12033 1726867182.02770: checking for max_fail_percentage 12033 1726867182.02772: done checking for max_fail_percentage 12033 1726867182.02773: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.02774: done checking to see if all hosts have failed 12033 1726867182.02774: getting the remaining hosts for this loop 12033 1726867182.02776: done getting the remaining hosts for this loop 12033 1726867182.02782: getting the next task for host managed_node3 12033 1726867182.02791: done getting next task for host managed_node3 12033 1726867182.02794: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12033 1726867182.02800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867182.02803: getting variables 12033 1726867182.02804: in VariableManager get_vars() 12033 1726867182.02829: Calling all_inventory to load vars for managed_node3 12033 1726867182.02832: Calling groups_inventory to load vars for managed_node3 12033 1726867182.02835: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.02845: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.02848: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.02851: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.06580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.08530: done with get_vars() 12033 1726867182.08563: done getting variables 12033 1726867182.08630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.08760: variable 'profile' from source: include params 12033 1726867182.08764: variable 'bond_port_profile' from source: include params 12033 1726867182.08827: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:42 -0400 (0:00:00.106) 0:00:21.204 ****** 12033 1726867182.08862: entering _queue_task() for managed_node3/command 12033 1726867182.09333: worker is 1 (out of 1 available) 12033 1726867182.09343: exiting _queue_task() for managed_node3/command 12033 1726867182.09353: done queuing things up, now waiting for results queue to drain 12033 1726867182.09354: waiting for pending 
results... 12033 1726867182.09595: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 12033 1726867182.09686: in run() - task 0affcac9-a3a5-74bb-502b-00000000055e 12033 1726867182.09708: variable 'ansible_search_path' from source: unknown 12033 1726867182.09717: variable 'ansible_search_path' from source: unknown 12033 1726867182.09762: calling self._execute() 12033 1726867182.09870: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.09901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.09905: variable 'omit' from source: magic vars 12033 1726867182.10266: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.10287: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.10442: variable 'profile_stat' from source: set_fact 12033 1726867182.10446: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867182.10448: when evaluation is False, skipping this task 12033 1726867182.10449: _execute() done 12033 1726867182.10451: dumping result to json 12033 1726867182.10453: done dumping result, returning 12033 1726867182.10683: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-74bb-502b-00000000055e] 12033 1726867182.10686: sending task result for task 0affcac9-a3a5-74bb-502b-00000000055e 12033 1726867182.10753: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000055e 12033 1726867182.10756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867182.10931: no more pending results, returning what we have 12033 1726867182.10935: results queue empty 12033 1726867182.10936: checking for any_errors_fatal 12033 1726867182.10943: done checking for any_errors_fatal 12033 
1726867182.10944: checking for max_fail_percentage 12033 1726867182.10946: done checking for max_fail_percentage 12033 1726867182.10946: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.10947: done checking to see if all hosts have failed 12033 1726867182.10948: getting the remaining hosts for this loop 12033 1726867182.10950: done getting the remaining hosts for this loop 12033 1726867182.10953: getting the next task for host managed_node3 12033 1726867182.10961: done getting next task for host managed_node3 12033 1726867182.10964: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12033 1726867182.10970: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.10974: getting variables 12033 1726867182.10975: in VariableManager get_vars() 12033 1726867182.11013: Calling all_inventory to load vars for managed_node3 12033 1726867182.11016: Calling groups_inventory to load vars for managed_node3 12033 1726867182.11019: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.11031: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.11034: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.11036: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.13744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.17232: done with get_vars() 12033 1726867182.17254: done getting variables 12033 1726867182.17318: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.17432: variable 'profile' from source: include params 12033 1726867182.17436: variable 'bond_port_profile' from source: include params 12033 1726867182.17701: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:42 -0400 (0:00:00.088) 0:00:21.293 ****** 12033 1726867182.17735: entering _queue_task() for managed_node3/set_fact 12033 1726867182.18259: worker is 1 (out of 1 available) 12033 1726867182.18272: exiting _queue_task() for managed_node3/set_fact 12033 1726867182.18686: 
done queuing things up, now waiting for results queue to drain 12033 1726867182.18688: waiting for pending results... 12033 1726867182.18994: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 12033 1726867182.19136: in run() - task 0affcac9-a3a5-74bb-502b-00000000055f 12033 1726867182.19150: variable 'ansible_search_path' from source: unknown 12033 1726867182.19153: variable 'ansible_search_path' from source: unknown 12033 1726867182.19189: calling self._execute() 12033 1726867182.19319: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.19593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.19596: variable 'omit' from source: magic vars 12033 1726867182.20279: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.20292: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.20530: variable 'profile_stat' from source: set_fact 12033 1726867182.20544: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867182.20547: when evaluation is False, skipping this task 12033 1726867182.20549: _execute() done 12033 1726867182.20552: dumping result to json 12033 1726867182.20555: done dumping result, returning 12033 1726867182.20561: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-74bb-502b-00000000055f] 12033 1726867182.20583: sending task result for task 0affcac9-a3a5-74bb-502b-00000000055f 12033 1726867182.20771: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000055f 12033 1726867182.20775: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867182.20843: no more pending results, returning what we have 12033 1726867182.20848: results queue empty 12033 
1726867182.20849: checking for any_errors_fatal 12033 1726867182.20856: done checking for any_errors_fatal 12033 1726867182.20857: checking for max_fail_percentage 12033 1726867182.20859: done checking for max_fail_percentage 12033 1726867182.20860: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.20861: done checking to see if all hosts have failed 12033 1726867182.20862: getting the remaining hosts for this loop 12033 1726867182.20863: done getting the remaining hosts for this loop 12033 1726867182.20866: getting the next task for host managed_node3 12033 1726867182.20876: done getting next task for host managed_node3 12033 1726867182.20939: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12033 1726867182.20947: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.20952: getting variables 12033 1726867182.20953: in VariableManager get_vars() 12033 1726867182.20989: Calling all_inventory to load vars for managed_node3 12033 1726867182.20994: Calling groups_inventory to load vars for managed_node3 12033 1726867182.20998: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.21011: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.21015: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.21018: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.22676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.25109: done with get_vars() 12033 1726867182.25134: done getting variables 12033 1726867182.25198: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.25511: variable 'profile' from source: include params 12033 1726867182.25515: variable 'bond_port_profile' from source: include params 12033 1726867182.25574: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:42 -0400 (0:00:00.078) 0:00:21.371 ****** 12033 1726867182.25614: entering _queue_task() for managed_node3/command 12033 1726867182.26323: worker 
is 1 (out of 1 available) 12033 1726867182.26334: exiting _queue_task() for managed_node3/command 12033 1726867182.26346: done queuing things up, now waiting for results queue to drain 12033 1726867182.26347: waiting for pending results... 12033 1726867182.26906: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 12033 1726867182.27008: in run() - task 0affcac9-a3a5-74bb-502b-000000000560 12033 1726867182.27031: variable 'ansible_search_path' from source: unknown 12033 1726867182.27038: variable 'ansible_search_path' from source: unknown 12033 1726867182.27075: calling self._execute() 12033 1726867182.27342: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.27365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.27382: variable 'omit' from source: magic vars 12033 1726867182.28081: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.28219: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.28314: variable 'profile_stat' from source: set_fact 12033 1726867182.28452: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867182.28461: when evaluation is False, skipping this task 12033 1726867182.28468: _execute() done 12033 1726867182.28474: dumping result to json 12033 1726867182.28484: done dumping result, returning 12033 1726867182.28495: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-74bb-502b-000000000560] 12033 1726867182.28506: sending task result for task 0affcac9-a3a5-74bb-502b-000000000560 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867182.28766: no more pending results, returning what we have 12033 1726867182.28770: results queue empty 12033 1726867182.28771: checking for any_errors_fatal 
12033 1726867182.28780: done checking for any_errors_fatal 12033 1726867182.28781: checking for max_fail_percentage 12033 1726867182.28784: done checking for max_fail_percentage 12033 1726867182.28785: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.28786: done checking to see if all hosts have failed 12033 1726867182.28787: getting the remaining hosts for this loop 12033 1726867182.28789: done getting the remaining hosts for this loop 12033 1726867182.28795: getting the next task for host managed_node3 12033 1726867182.28804: done getting next task for host managed_node3 12033 1726867182.28806: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12033 1726867182.28813: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.28818: getting variables 12033 1726867182.28820: in VariableManager get_vars() 12033 1726867182.28855: Calling all_inventory to load vars for managed_node3 12033 1726867182.28858: Calling groups_inventory to load vars for managed_node3 12033 1726867182.28862: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.28876: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.28885: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.28893: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.29453: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000560 12033 1726867182.29463: WORKER PROCESS EXITING 12033 1726867182.31823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.35119: done with get_vars() 12033 1726867182.35142: done getting variables 12033 1726867182.35201: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.35485: variable 'profile' from source: include params 12033 1726867182.35492: variable 'bond_port_profile' from source: include params 12033 1726867182.35661: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:42 -0400 (0:00:00.100) 0:00:21.472 ****** 12033 
1726867182.35697: entering _queue_task() for managed_node3/set_fact 12033 1726867182.36443: worker is 1 (out of 1 available) 12033 1726867182.36455: exiting _queue_task() for managed_node3/set_fact 12033 1726867182.36466: done queuing things up, now waiting for results queue to drain 12033 1726867182.36468: waiting for pending results... 12033 1726867182.36896: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 12033 1726867182.37167: in run() - task 0affcac9-a3a5-74bb-502b-000000000561 12033 1726867182.37244: variable 'ansible_search_path' from source: unknown 12033 1726867182.37269: variable 'ansible_search_path' from source: unknown 12033 1726867182.37322: calling self._execute() 12033 1726867182.37672: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.37676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.37682: variable 'omit' from source: magic vars 12033 1726867182.38428: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.38446: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.38567: variable 'profile_stat' from source: set_fact 12033 1726867182.38585: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867182.38593: when evaluation is False, skipping this task 12033 1726867182.38604: _execute() done 12033 1726867182.38610: dumping result to json 12033 1726867182.38617: done dumping result, returning 12033 1726867182.38627: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-74bb-502b-000000000561] 12033 1726867182.38636: sending task result for task 0affcac9-a3a5-74bb-502b-000000000561 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867182.38776: no more pending results, returning what 
we have 12033 1726867182.38782: results queue empty 12033 1726867182.38783: checking for any_errors_fatal 12033 1726867182.38792: done checking for any_errors_fatal 12033 1726867182.38793: checking for max_fail_percentage 12033 1726867182.38795: done checking for max_fail_percentage 12033 1726867182.38796: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.38797: done checking to see if all hosts have failed 12033 1726867182.38797: getting the remaining hosts for this loop 12033 1726867182.38799: done getting the remaining hosts for this loop 12033 1726867182.38802: getting the next task for host managed_node3 12033 1726867182.38811: done getting next task for host managed_node3 12033 1726867182.38814: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12033 1726867182.38824: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867182.38829: getting variables 12033 1726867182.38830: in VariableManager get_vars() 12033 1726867182.38861: Calling all_inventory to load vars for managed_node3 12033 1726867182.38864: Calling groups_inventory to load vars for managed_node3 12033 1726867182.38867: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.38882: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.38885: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.38888: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.39411: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000561 12033 1726867182.39415: WORKER PROCESS EXITING 12033 1726867182.40689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.43869: done with get_vars() 12033 1726867182.43899: done getting variables 12033 1726867182.43958: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.44276: variable 'profile' from source: include params 12033 1726867182.44282: variable 'bond_port_profile' from source: include params 12033 1726867182.44340: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:42 -0400 (0:00:00.086) 0:00:21.559 ****** 12033 1726867182.44580: entering _queue_task() for managed_node3/assert 12033 1726867182.45253: worker is 1 (out of 1 available) 12033 1726867182.45266: exiting _queue_task() for 
managed_node3/assert 12033 1726867182.45340: done queuing things up, now waiting for results queue to drain 12033 1726867182.45342: waiting for pending results... 12033 1726867182.45841: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 12033 1726867182.45991: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e1 12033 1726867182.46013: variable 'ansible_search_path' from source: unknown 12033 1726867182.46087: variable 'ansible_search_path' from source: unknown 12033 1726867182.46327: calling self._execute() 12033 1726867182.46331: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.46584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.46588: variable 'omit' from source: magic vars 12033 1726867182.47342: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.47358: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.47369: variable 'omit' from source: magic vars 12033 1726867182.47450: variable 'omit' from source: magic vars 12033 1726867182.47600: variable 'profile' from source: include params 12033 1726867182.47618: variable 'bond_port_profile' from source: include params 12033 1726867182.47690: variable 'bond_port_profile' from source: include params 12033 1726867182.47715: variable 'omit' from source: magic vars 12033 1726867182.47768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.47810: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.47834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.47857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.47875: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.47914: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867182.47925: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.47933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.48033: Set connection var ansible_pipelining to False 12033 1726867182.48050: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.48064: Set connection var ansible_timeout to 10 12033 1726867182.48074: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.48085: Set connection var ansible_connection to ssh 12033 1726867182.48096: Set connection var ansible_shell_type to sh 12033 1726867182.48119: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.48125: variable 'ansible_connection' from source: unknown 12033 1726867182.48131: variable 'ansible_module_compression' from source: unknown 12033 1726867182.48140: variable 'ansible_shell_type' from source: unknown 12033 1726867182.48145: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.48151: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.48283: variable 'ansible_pipelining' from source: unknown 12033 1726867182.48287: variable 'ansible_timeout' from source: unknown 12033 1726867182.48290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.48337: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867182.48349: variable 'omit' from source: magic vars 12033 1726867182.48354: starting 
attempt loop 12033 1726867182.48364: running the handler 12033 1726867182.48498: variable 'lsr_net_profile_exists' from source: set_fact 12033 1726867182.48502: Evaluated conditional (lsr_net_profile_exists): True 12033 1726867182.48509: handler run complete 12033 1726867182.48523: attempt loop complete, returning result 12033 1726867182.48526: _execute() done 12033 1726867182.48529: dumping result to json 12033 1726867182.48531: done dumping result, returning 12033 1726867182.48538: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0affcac9-a3a5-74bb-502b-0000000004e1] 12033 1726867182.48543: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e1 12033 1726867182.48633: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e1 12033 1726867182.48636: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867182.48811: no more pending results, returning what we have 12033 1726867182.48815: results queue empty 12033 1726867182.48817: checking for any_errors_fatal 12033 1726867182.48824: done checking for any_errors_fatal 12033 1726867182.48825: checking for max_fail_percentage 12033 1726867182.48827: done checking for max_fail_percentage 12033 1726867182.48828: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.48829: done checking to see if all hosts have failed 12033 1726867182.48829: getting the remaining hosts for this loop 12033 1726867182.48831: done getting the remaining hosts for this loop 12033 1726867182.48836: getting the next task for host managed_node3 12033 1726867182.48843: done getting next task for host managed_node3 12033 1726867182.48846: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12033 1726867182.48851: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867182.48856: getting variables 12033 1726867182.48859: in VariableManager get_vars() 12033 1726867182.48894: Calling all_inventory to load vars for managed_node3 12033 1726867182.48898: Calling groups_inventory to load vars for managed_node3 12033 1726867182.48902: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.48915: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.48918: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.48921: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.50912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.52582: done with get_vars() 12033 1726867182.52605: done getting variables 12033 1726867182.52660: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.52795: variable 'profile' from source: include params 12033 1726867182.52799: variable 'bond_port_profile' from source: include params 12033 1726867182.52858: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:42 -0400 (0:00:00.085) 0:00:21.644 ****** 12033 1726867182.52905: entering _queue_task() for managed_node3/assert 12033 1726867182.53598: worker is 1 (out of 1 available) 12033 1726867182.53612: exiting _queue_task() for managed_node3/assert 12033 1726867182.53625: done queuing things up, now waiting for results queue to drain 12033 1726867182.53626: waiting for 
pending results... 12033 1726867182.54105: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 12033 1726867182.54222: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e2 12033 1726867182.54237: variable 'ansible_search_path' from source: unknown 12033 1726867182.54241: variable 'ansible_search_path' from source: unknown 12033 1726867182.54384: calling self._execute() 12033 1726867182.54468: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.54475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.54486: variable 'omit' from source: magic vars 12033 1726867182.55386: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.55408: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.55411: variable 'omit' from source: magic vars 12033 1726867182.55461: variable 'omit' from source: magic vars 12033 1726867182.55587: variable 'profile' from source: include params 12033 1726867182.55597: variable 'bond_port_profile' from source: include params 12033 1726867182.55695: variable 'bond_port_profile' from source: include params 12033 1726867182.55721: variable 'omit' from source: magic vars 12033 1726867182.55761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.55816: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.56021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.56029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.56032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 
1726867182.56034: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867182.56037: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.56039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.56041: Set connection var ansible_pipelining to False 12033 1726867182.56043: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.56045: Set connection var ansible_timeout to 10 12033 1726867182.56047: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.56050: Set connection var ansible_connection to ssh 12033 1726867182.56052: Set connection var ansible_shell_type to sh 12033 1726867182.56086: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.56089: variable 'ansible_connection' from source: unknown 12033 1726867182.56091: variable 'ansible_module_compression' from source: unknown 12033 1726867182.56093: variable 'ansible_shell_type' from source: unknown 12033 1726867182.56095: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.56097: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.56099: variable 'ansible_pipelining' from source: unknown 12033 1726867182.56102: variable 'ansible_timeout' from source: unknown 12033 1726867182.56104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.56257: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867182.56269: variable 'omit' from source: magic vars 12033 1726867182.56275: starting attempt loop 12033 1726867182.56302: running the handler 12033 1726867182.56483: variable 
'lsr_net_profile_ansible_managed' from source: set_fact 12033 1726867182.56486: Evaluated conditional (lsr_net_profile_ansible_managed): True 12033 1726867182.56489: handler run complete 12033 1726867182.56491: attempt loop complete, returning result 12033 1726867182.56493: _execute() done 12033 1726867182.56495: dumping result to json 12033 1726867182.56497: done dumping result, returning 12033 1726867182.56499: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcac9-a3a5-74bb-502b-0000000004e2] 12033 1726867182.56500: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e2 12033 1726867182.56557: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e2 12033 1726867182.56559: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867182.56622: no more pending results, returning what we have 12033 1726867182.56626: results queue empty 12033 1726867182.56628: checking for any_errors_fatal 12033 1726867182.56635: done checking for any_errors_fatal 12033 1726867182.56636: checking for max_fail_percentage 12033 1726867182.56638: done checking for max_fail_percentage 12033 1726867182.56639: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.56640: done checking to see if all hosts have failed 12033 1726867182.56641: getting the remaining hosts for this loop 12033 1726867182.56643: done getting the remaining hosts for this loop 12033 1726867182.56647: getting the next task for host managed_node3 12033 1726867182.56655: done getting next task for host managed_node3 12033 1726867182.56658: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12033 1726867182.56663: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.56668: getting variables 12033 1726867182.56669: in VariableManager get_vars() 12033 1726867182.56707: Calling all_inventory to load vars for managed_node3 12033 1726867182.56710: Calling groups_inventory to load vars for managed_node3 12033 1726867182.56714: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.56726: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.56729: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.56733: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.58508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.61598: done with get_vars() 12033 1726867182.61618: done getting variables 12033 1726867182.61729: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867182.62121: variable 'profile' from source: include params 12033 1726867182.62126: variable 'bond_port_profile' from source: include params 12033 1726867182.62188: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:42 -0400 (0:00:00.093) 0:00:21.737 ****** 12033 1726867182.62222: entering _queue_task() for managed_node3/assert 12033 1726867182.62755: worker is 1 (out of 1 available) 12033 1726867182.62768: exiting _queue_task() for managed_node3/assert 12033 1726867182.62783: done queuing things up, now waiting for results queue to drain 12033 1726867182.62785: waiting for pending results... 
12033 1726867182.63200: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 12033 1726867182.63321: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e3 12033 1726867182.63325: variable 'ansible_search_path' from source: unknown 12033 1726867182.63328: variable 'ansible_search_path' from source: unknown 12033 1726867182.63331: calling self._execute() 12033 1726867182.63395: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.63407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.63425: variable 'omit' from source: magic vars 12033 1726867182.63815: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.63831: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.63840: variable 'omit' from source: magic vars 12033 1726867182.63914: variable 'omit' from source: magic vars 12033 1726867182.64084: variable 'profile' from source: include params 12033 1726867182.64088: variable 'bond_port_profile' from source: include params 12033 1726867182.64110: variable 'bond_port_profile' from source: include params 12033 1726867182.64138: variable 'omit' from source: magic vars 12033 1726867182.64196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.64238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.64263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.64292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.64318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.64364: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867182.64382: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.64385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.64519: Set connection var ansible_pipelining to False 12033 1726867182.64522: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.64524: Set connection var ansible_timeout to 10 12033 1726867182.64526: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.64532: Set connection var ansible_connection to ssh 12033 1726867182.64541: Set connection var ansible_shell_type to sh 12033 1726867182.64564: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.64629: variable 'ansible_connection' from source: unknown 12033 1726867182.64632: variable 'ansible_module_compression' from source: unknown 12033 1726867182.64634: variable 'ansible_shell_type' from source: unknown 12033 1726867182.64636: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.64638: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.64640: variable 'ansible_pipelining' from source: unknown 12033 1726867182.64642: variable 'ansible_timeout' from source: unknown 12033 1726867182.64644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.64769: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867182.64794: variable 'omit' from source: magic vars 12033 1726867182.64808: starting attempt loop 12033 1726867182.64816: running the handler 12033 1726867182.64936: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12033 1726867182.64945: Evaluated conditional (lsr_net_profile_fingerprint): True 12033 1726867182.64984: handler run complete 12033 1726867182.64987: attempt loop complete, returning result 12033 1726867182.64992: _execute() done 12033 1726867182.64995: dumping result to json 12033 1726867182.64997: done dumping result, returning 12033 1726867182.65063: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcac9-a3a5-74bb-502b-0000000004e3] 12033 1726867182.65066: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e3 12033 1726867182.65213: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e3 12033 1726867182.65216: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867182.65264: no more pending results, returning what we have 12033 1726867182.65268: results queue empty 12033 1726867182.65269: checking for any_errors_fatal 12033 1726867182.65276: done checking for any_errors_fatal 12033 1726867182.65276: checking for max_fail_percentage 12033 1726867182.65280: done checking for max_fail_percentage 12033 1726867182.65282: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.65282: done checking to see if all hosts have failed 12033 1726867182.65283: getting the remaining hosts for this loop 12033 1726867182.65285: done getting the remaining hosts for this loop 12033 1726867182.65289: getting the next task for host managed_node3 12033 1726867182.65305: done getting next task for host managed_node3 12033 1726867182.65309: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12033 1726867182.65314: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.65319: getting variables 12033 1726867182.65321: in VariableManager get_vars() 12033 1726867182.65355: Calling all_inventory to load vars for managed_node3 12033 1726867182.65358: Calling groups_inventory to load vars for managed_node3 12033 1726867182.65362: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.65374: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.65553: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.65560: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.67159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.68724: done with get_vars() 12033 1726867182.68745: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 
September 2024 17:19:42 -0400 (0:00:00.066) 0:00:21.804 ****** 12033 1726867182.68843: entering _queue_task() for managed_node3/include_tasks 12033 1726867182.69140: worker is 1 (out of 1 available) 12033 1726867182.69152: exiting _queue_task() for managed_node3/include_tasks 12033 1726867182.69165: done queuing things up, now waiting for results queue to drain 12033 1726867182.69166: waiting for pending results... 12033 1726867182.69498: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 12033 1726867182.69542: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e7 12033 1726867182.69564: variable 'ansible_search_path' from source: unknown 12033 1726867182.69572: variable 'ansible_search_path' from source: unknown 12033 1726867182.69621: calling self._execute() 12033 1726867182.69883: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.69887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.69892: variable 'omit' from source: magic vars 12033 1726867182.70130: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.70147: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.70158: _execute() done 12033 1726867182.70166: dumping result to json 12033 1726867182.70173: done dumping result, returning 12033 1726867182.70185: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-74bb-502b-0000000004e7] 12033 1726867182.70199: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e7 12033 1726867182.70337: no more pending results, returning what we have 12033 1726867182.70342: in VariableManager get_vars() 12033 1726867182.70375: Calling all_inventory to load vars for managed_node3 12033 1726867182.70379: Calling groups_inventory to load vars for managed_node3 12033 1726867182.70382: Calling all_plugins_inventory to load vars for 
managed_node3 12033 1726867182.70487: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.70491: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.70495: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.71016: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e7 12033 1726867182.71019: WORKER PROCESS EXITING 12033 1726867182.72245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.73833: done with get_vars() 12033 1726867182.73851: variable 'ansible_search_path' from source: unknown 12033 1726867182.73852: variable 'ansible_search_path' from source: unknown 12033 1726867182.73887: we have included files to process 12033 1726867182.73888: generating all_blocks data 12033 1726867182.73892: done generating all_blocks data 12033 1726867182.73896: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867182.73897: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867182.73899: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12033 1726867182.74838: done processing included file 12033 1726867182.74840: iterating over new_blocks loaded from include file 12033 1726867182.74841: in VariableManager get_vars() 12033 1726867182.74857: done with get_vars() 12033 1726867182.74859: filtering new block on tags 12033 1726867182.74948: done filtering new block on tags 12033 1726867182.74951: in VariableManager get_vars() 12033 1726867182.74967: done with get_vars() 12033 1726867182.74969: filtering new block on tags 12033 1726867182.75040: done filtering new block on tags 12033 1726867182.75043: done iterating over new_blocks loaded 
from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 12033 1726867182.75048: extending task lists for all hosts with included blocks 12033 1726867182.75499: done extending task lists 12033 1726867182.75501: done processing included files 12033 1726867182.75501: results queue empty 12033 1726867182.75502: checking for any_errors_fatal 12033 1726867182.75505: done checking for any_errors_fatal 12033 1726867182.75506: checking for max_fail_percentage 12033 1726867182.75507: done checking for max_fail_percentage 12033 1726867182.75508: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.75508: done checking to see if all hosts have failed 12033 1726867182.75509: getting the remaining hosts for this loop 12033 1726867182.75510: done getting the remaining hosts for this loop 12033 1726867182.75513: getting the next task for host managed_node3 12033 1726867182.75517: done getting next task for host managed_node3 12033 1726867182.75519: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867182.75523: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867182.75525: getting variables 12033 1726867182.75526: in VariableManager get_vars() 12033 1726867182.75534: Calling all_inventory to load vars for managed_node3 12033 1726867182.75536: Calling groups_inventory to load vars for managed_node3 12033 1726867182.75538: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.75543: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.75546: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.75549: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.76696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.78301: done with get_vars() 12033 1726867182.78323: done getting variables 12033 1726867182.78369: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:42 -0400 (0:00:00.095) 0:00:21.899 ****** 12033 1726867182.78409: entering _queue_task() for managed_node3/set_fact 12033 1726867182.78755: worker is 1 (out of 1 available) 12033 1726867182.78767: exiting _queue_task() for managed_node3/set_fact 12033 1726867182.78782: done queuing things up, now waiting for results queue to drain 12033 1726867182.78784: waiting for pending results... 12033 1726867182.79092: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 12033 1726867182.79206: in run() - task 0affcac9-a3a5-74bb-502b-0000000005b4 12033 1726867182.79232: variable 'ansible_search_path' from source: unknown 12033 1726867182.79236: variable 'ansible_search_path' from source: unknown 12033 1726867182.79382: calling self._execute() 12033 1726867182.79386: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.79392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.79395: variable 'omit' from source: magic vars 12033 1726867182.79715: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.79726: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.79732: variable 'omit' from source: magic vars 12033 1726867182.79798: variable 'omit' from source: magic vars 12033 1726867182.79831: variable 'omit' from source: magic vars 12033 1726867182.79865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.79906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.79924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.79944: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.79950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.79979: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867182.79982: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.80055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.80080: Set connection var ansible_pipelining to False 12033 1726867182.80089: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.80096: Set connection var ansible_timeout to 10 12033 1726867182.80107: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.80110: Set connection var ansible_connection to ssh 12033 1726867182.80116: Set connection var ansible_shell_type to sh 12033 1726867182.80139: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.80142: variable 'ansible_connection' from source: unknown 12033 1726867182.80145: variable 'ansible_module_compression' from source: unknown 12033 1726867182.80148: variable 'ansible_shell_type' from source: unknown 12033 1726867182.80150: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.80152: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.80155: variable 'ansible_pipelining' from source: unknown 12033 1726867182.80279: variable 'ansible_timeout' from source: unknown 12033 1726867182.80283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.80300: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867182.80311: variable 'omit' from source: magic vars 12033 1726867182.80321: starting attempt loop 12033 1726867182.80324: running the handler 12033 1726867182.80336: handler run complete 12033 1726867182.80346: attempt loop complete, returning result 12033 1726867182.80349: _execute() done 12033 1726867182.80352: dumping result to json 12033 1726867182.80354: done dumping result, returning 12033 1726867182.80361: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-74bb-502b-0000000005b4] 12033 1726867182.80400: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b4 12033 1726867182.80461: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b4 12033 1726867182.80464: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12033 1726867182.80550: no more pending results, returning what we have 12033 1726867182.80554: results queue empty 12033 1726867182.80555: checking for any_errors_fatal 12033 1726867182.80557: done checking for any_errors_fatal 12033 1726867182.80558: checking for max_fail_percentage 12033 1726867182.80560: done checking for max_fail_percentage 12033 1726867182.80560: checking to see if all hosts have failed and the running result is not ok 12033 1726867182.80561: done checking to see if all hosts have failed 12033 1726867182.80562: getting the remaining hosts for this loop 12033 1726867182.80564: done getting the remaining hosts for this loop 12033 1726867182.80567: getting the next task for host managed_node3 12033 1726867182.80574: done getting next task for host managed_node3 12033 1726867182.80576: ^ task is: 
TASK: Stat profile file 12033 1726867182.80583: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867182.80586: getting variables 12033 1726867182.80587: in VariableManager get_vars() 12033 1726867182.80615: Calling all_inventory to load vars for managed_node3 12033 1726867182.80617: Calling groups_inventory to load vars for managed_node3 12033 1726867182.80620: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867182.80629: Calling all_plugins_play to load vars for managed_node3 12033 1726867182.80631: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867182.80633: Calling groups_plugins_play to load vars for managed_node3 12033 1726867182.82366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867182.84010: done with get_vars() 12033 1726867182.84034: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:42 -0400 (0:00:00.057) 0:00:21.957 ****** 12033 1726867182.84146: entering _queue_task() for managed_node3/stat 12033 1726867182.84455: worker is 1 (out of 1 available) 12033 1726867182.84468: exiting _queue_task() for managed_node3/stat 12033 1726867182.84482: done queuing things up, now waiting for results queue to drain 12033 1726867182.84488: waiting for pending results... 
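The "Stat profile file" task that runs next invokes Ansible's `stat` module on the managed node; further down, the log shows the module args (`path: /etc/sysconfig/network-scripts/ifcfg-bond0.1`, `get_checksum: false`) and the result `{"stat": {"exists": false}}`. As a rough illustration only, the core of what the module reports here boils down to a filesystem existence check. This is a hand-written sketch, not the real module code, and it returns only the `exists` key (the actual module returns many more fields):

```python
import os

def stat_profile(path):
    # Minimal sketch of the stat-module result shape seen in this log:
    # a 'changed: false' marker plus a 'stat' dict. The real module
    # also reports mode, uid, checksums, etc. when the file exists.
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# The log's check targets the initscripts-style ifcfg file for the
# bond port profile; on this managed node the file is absent, so the
# task comes back ok with stat.exists == false.
result = stat_profile("/etc/sysconfig/network-scripts/ifcfg-bond0.1")
print(result)
```

The boolean flags initialized by the preceding `set_fact` task (`lsr_net_profile_exists` and friends, all `false`) are later flipped based on results of checks like this one.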
12033 1726867182.84894: running TaskExecutor() for managed_node3/TASK: Stat profile file 12033 1726867182.84908: in run() - task 0affcac9-a3a5-74bb-502b-0000000005b5 12033 1726867182.84929: variable 'ansible_search_path' from source: unknown 12033 1726867182.84935: variable 'ansible_search_path' from source: unknown 12033 1726867182.84970: calling self._execute() 12033 1726867182.85069: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.85081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.85097: variable 'omit' from source: magic vars 12033 1726867182.85462: variable 'ansible_distribution_major_version' from source: facts 12033 1726867182.85555: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867182.85559: variable 'omit' from source: magic vars 12033 1726867182.85565: variable 'omit' from source: magic vars 12033 1726867182.85673: variable 'profile' from source: include params 12033 1726867182.85693: variable 'bond_port_profile' from source: include params 12033 1726867182.85759: variable 'bond_port_profile' from source: include params 12033 1726867182.85880: variable 'omit' from source: magic vars 12033 1726867182.85883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867182.85886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867182.85909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867182.85931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.85949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867182.85985: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867182.86005: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.86015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.86127: Set connection var ansible_pipelining to False 12033 1726867182.86142: Set connection var ansible_shell_executable to /bin/sh 12033 1726867182.86182: Set connection var ansible_timeout to 10 12033 1726867182.86186: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867182.86188: Set connection var ansible_connection to ssh 12033 1726867182.86193: Set connection var ansible_shell_type to sh 12033 1726867182.86207: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.86224: variable 'ansible_connection' from source: unknown 12033 1726867182.86232: variable 'ansible_module_compression' from source: unknown 12033 1726867182.86327: variable 'ansible_shell_type' from source: unknown 12033 1726867182.86330: variable 'ansible_shell_executable' from source: unknown 12033 1726867182.86333: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867182.86335: variable 'ansible_pipelining' from source: unknown 12033 1726867182.86337: variable 'ansible_timeout' from source: unknown 12033 1726867182.86339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867182.86488: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867182.86511: variable 'omit' from source: magic vars 12033 1726867182.86522: starting attempt loop 12033 1726867182.86529: running the handler 12033 1726867182.86554: _low_level_execute_command(): starting 12033 1726867182.86653: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867182.87472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867182.87566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867182.87601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867182.87624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867182.87722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867182.89417: stdout chunk (state=3): >>>/root <<< 12033 1726867182.89572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867182.89576: stdout chunk (state=3): >>><<< 12033 1726867182.89580: stderr chunk (state=3): >>><<< 12033 1726867182.89598: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867182.89619: _low_level_execute_command(): starting 12033 1726867182.89683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890 `" && echo ansible-tmp-1726867182.8960443-13061-70736368836890="` echo /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890 `" ) && sleep 0' 12033 1726867182.90327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867182.90340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867182.90356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867182.90467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867182.90603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867182.90755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867182.90810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867182.92741: stdout chunk (state=3): >>>ansible-tmp-1726867182.8960443-13061-70736368836890=/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890 <<< 12033 1726867182.92906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867182.92909: stdout chunk (state=3): >>><<< 12033 1726867182.92914: stderr chunk (state=3): >>><<< 12033 1726867182.93087: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867182.8960443-13061-70736368836890=/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867182.93092: variable 'ansible_module_compression' from source: unknown 12033 1726867182.93094: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867182.93096: variable 'ansible_facts' from source: unknown 12033 1726867182.93188: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py 12033 1726867182.93338: Sending initial data 12033 1726867182.93345: Sent initial data (152 bytes) 12033 1726867182.93953: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867182.93993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867182.94010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867182.94104: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867182.94127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867182.94144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867182.94166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867182.94248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867182.95805: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867182.95869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867182.95953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmplbk2vlt8 /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py <<< 12033 1726867182.95971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py" <<< 12033 1726867182.96006: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 12033 1726867182.96020: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmplbk2vlt8" to remote "/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py" <<< 12033 1726867182.96885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867182.96889: stderr chunk (state=3): >>><<< 12033 1726867182.96895: stdout chunk (state=3): >>><<< 12033 1726867182.96905: done transferring module to remote 12033 1726867182.96920: _low_level_execute_command(): starting 12033 1726867182.96940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/ /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py && sleep 0' 12033 1726867182.97536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867182.97599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867182.97656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867182.97673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867182.97717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867182.97785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867182.99585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867182.99594: stdout chunk (state=3): >>><<< 12033 1726867182.99597: stderr chunk (state=3): >>><<< 12033 1726867182.99620: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867182.99700: _low_level_execute_command(): starting 12033 1726867182.99704: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/AnsiballZ_stat.py && sleep 0' 12033 1726867183.00266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.00281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867183.00358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.00411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.00434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.00525: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.15623: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867183.17078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867183.17082: stdout chunk (state=3): >>><<< 12033 1726867183.17084: stderr chunk (state=3): >>><<< 12033 1726867183.17183: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867183.17187: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867183.17189: _low_level_execute_command(): starting 12033 1726867183.17192: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867182.8960443-13061-70736368836890/ > /dev/null 2>&1 && sleep 0' 12033 1726867183.17762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.17776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867183.17795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867183.17844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.17913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.17931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.17963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.18053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.20082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.20086: stdout chunk (state=3): >>><<< 12033 1726867183.20089: stderr chunk (state=3): >>><<< 12033 1726867183.20093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867183.20101: handler run complete 12033 1726867183.20104: attempt loop complete, returning result 12033 1726867183.20105: _execute() done 12033 1726867183.20107: dumping result to json 12033 1726867183.20108: done dumping result, returning 12033 1726867183.20110: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcac9-a3a5-74bb-502b-0000000005b5] 12033 1726867183.20112: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b5 12033 1726867183.20173: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b5 12033 1726867183.20175: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 12033 1726867183.20266: no more pending results, returning what we have 12033 1726867183.20270: results queue empty 12033 1726867183.20272: checking for any_errors_fatal 12033 1726867183.20286: done checking for any_errors_fatal 12033 1726867183.20287: checking for max_fail_percentage 12033 1726867183.20292: done checking for max_fail_percentage 12033 1726867183.20293: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.20294: done checking to see if all hosts have failed 12033 1726867183.20294: getting the remaining hosts for this loop 12033 1726867183.20296: done getting the remaining hosts for this loop 12033 1726867183.20307: getting the next task for host managed_node3 12033 1726867183.20316: done getting next task for host managed_node3 12033 1726867183.20318: ^ task is: TASK: Set NM profile exist flag based on the profile files 12033 1726867183.20323: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867183.20328: getting variables 12033 1726867183.20329: in VariableManager get_vars() 12033 1726867183.20362: Calling all_inventory to load vars for managed_node3 12033 1726867183.20366: Calling groups_inventory to load vars for managed_node3 12033 1726867183.20369: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.20484: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.20488: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.20495: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.21951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.23932: done with get_vars() 12033 1726867183.23955: done getting variables 12033 1726867183.24026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:43 -0400 (0:00:00.399) 0:00:22.356 ****** 12033 1726867183.24061: entering _queue_task() for managed_node3/set_fact 12033 1726867183.24382: worker is 1 (out of 1 available) 12033 1726867183.24400: exiting _queue_task() for managed_node3/set_fact 12033 1726867183.24412: done queuing things up, now waiting for results queue to drain 12033 1726867183.24414: waiting for pending results... 
12033 1726867183.24619: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 12033 1726867183.24763: in run() - task 0affcac9-a3a5-74bb-502b-0000000005b6 12033 1726867183.24787: variable 'ansible_search_path' from source: unknown 12033 1726867183.24796: variable 'ansible_search_path' from source: unknown 12033 1726867183.24836: calling self._execute() 12033 1726867183.24936: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.24948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.24970: variable 'omit' from source: magic vars 12033 1726867183.25382: variable 'ansible_distribution_major_version' from source: facts 12033 1726867183.25386: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867183.25474: variable 'profile_stat' from source: set_fact 12033 1726867183.25494: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867183.25508: when evaluation is False, skipping this task 12033 1726867183.25518: _execute() done 12033 1726867183.25527: dumping result to json 12033 1726867183.25534: done dumping result, returning 12033 1726867183.25544: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-74bb-502b-0000000005b6] 12033 1726867183.25554: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b6 12033 1726867183.25691: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b6 12033 1726867183.25695: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867183.25747: no more pending results, returning what we have 12033 1726867183.25750: results queue empty 12033 1726867183.25752: checking for any_errors_fatal 12033 1726867183.25761: done checking for any_errors_fatal 12033 1726867183.25762: 
checking for max_fail_percentage 12033 1726867183.25764: done checking for max_fail_percentage 12033 1726867183.25765: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.25766: done checking to see if all hosts have failed 12033 1726867183.25766: getting the remaining hosts for this loop 12033 1726867183.25768: done getting the remaining hosts for this loop 12033 1726867183.25771: getting the next task for host managed_node3 12033 1726867183.25938: done getting next task for host managed_node3 12033 1726867183.25941: ^ task is: TASK: Get NM profile info 12033 1726867183.26078: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12033 1726867183.26083: getting variables 12033 1726867183.26085: in VariableManager get_vars() 12033 1726867183.26111: Calling all_inventory to load vars for managed_node3 12033 1726867183.26114: Calling groups_inventory to load vars for managed_node3 12033 1726867183.26117: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.26126: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.26129: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.26132: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.32421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.33951: done with get_vars() 12033 1726867183.33976: done getting variables 12033 1726867183.34034: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:43 -0400 (0:00:00.100) 0:00:22.456 ****** 12033 1726867183.34067: entering _queue_task() for managed_node3/shell 12033 1726867183.34407: worker is 1 (out of 1 available) 12033 1726867183.34420: exiting _queue_task() for managed_node3/shell 12033 1726867183.34432: done queuing things up, now waiting for results queue to drain 12033 1726867183.34434: waiting for pending results... 
12033 1726867183.34883: running TaskExecutor() for managed_node3/TASK: Get NM profile info 12033 1726867183.34888: in run() - task 0affcac9-a3a5-74bb-502b-0000000005b7 12033 1726867183.34892: variable 'ansible_search_path' from source: unknown 12033 1726867183.34895: variable 'ansible_search_path' from source: unknown 12033 1726867183.35089: calling self._execute() 12033 1726867183.35094: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.35097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.35100: variable 'omit' from source: magic vars 12033 1726867183.35390: variable 'ansible_distribution_major_version' from source: facts 12033 1726867183.35404: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867183.35410: variable 'omit' from source: magic vars 12033 1726867183.35683: variable 'omit' from source: magic vars 12033 1726867183.35687: variable 'profile' from source: include params 12033 1726867183.35689: variable 'bond_port_profile' from source: include params 12033 1726867183.35694: variable 'bond_port_profile' from source: include params 12033 1726867183.35696: variable 'omit' from source: magic vars 12033 1726867183.35713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867183.35751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867183.35781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867183.35806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867183.35876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867183.35881: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867183.35884: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.35886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.35966: Set connection var ansible_pipelining to False 12033 1726867183.35981: Set connection var ansible_shell_executable to /bin/sh 12033 1726867183.35999: Set connection var ansible_timeout to 10 12033 1726867183.36008: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867183.36015: Set connection var ansible_connection to ssh 12033 1726867183.36023: Set connection var ansible_shell_type to sh 12033 1726867183.36048: variable 'ansible_shell_executable' from source: unknown 12033 1726867183.36081: variable 'ansible_connection' from source: unknown 12033 1726867183.36084: variable 'ansible_module_compression' from source: unknown 12033 1726867183.36086: variable 'ansible_shell_type' from source: unknown 12033 1726867183.36089: variable 'ansible_shell_executable' from source: unknown 12033 1726867183.36093: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.36095: variable 'ansible_pipelining' from source: unknown 12033 1726867183.36098: variable 'ansible_timeout' from source: unknown 12033 1726867183.36101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.36284: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867183.36287: variable 'omit' from source: magic vars 12033 1726867183.36289: starting attempt loop 12033 1726867183.36293: running the handler 12033 1726867183.36295: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867183.36297: _low_level_execute_command(): starting 12033 1726867183.36299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867183.37016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.37085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.37148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.37179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.37208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.37287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.38986: stdout chunk (state=3): >>>/root <<< 12033 1726867183.39075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.39130: stderr 
chunk (state=3): >>><<< 12033 1726867183.39144: stdout chunk (state=3): >>><<< 12033 1726867183.39406: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867183.39410: _low_level_execute_command(): starting 12033 1726867183.39414: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991 `" && echo ansible-tmp-1726867183.3931186-13083-16070265786991="` echo /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991 `" ) && sleep 0' 12033 1726867183.40552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.40608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867183.40668: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.40733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.40750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.40775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.40903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.42801: stdout chunk (state=3): >>>ansible-tmp-1726867183.3931186-13083-16070265786991=/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991 <<< 12033 1726867183.43150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.43153: stdout chunk (state=3): >>><<< 12033 1726867183.43156: stderr chunk (state=3): >>><<< 12033 1726867183.43159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867183.3931186-13083-16070265786991=/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867183.43162: variable 'ansible_module_compression' from source: unknown 12033 1726867183.43186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867183.43423: variable 'ansible_facts' from source: unknown 12033 1726867183.43532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py 12033 1726867183.43696: Sending initial data 12033 1726867183.43706: Sent initial data (155 bytes) 12033 1726867183.44955: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.44984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.45068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.45095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.45225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.46836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867183.46878: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867183.46915: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp68v30cn5 /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py <<< 12033 1726867183.46919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py" <<< 12033 1726867183.47030: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp68v30cn5" to remote "/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py" <<< 12033 1726867183.48360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.48363: stdout chunk (state=3): >>><<< 12033 1726867183.48372: stderr chunk (state=3): >>><<< 12033 1726867183.48507: done transferring module to remote 12033 1726867183.48518: _low_level_execute_command(): starting 12033 1726867183.48523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/ /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py && sleep 0' 12033 1726867183.49784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.49793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867183.49796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867183.49799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.49802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867183.49809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867183.49812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.49919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.49926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.49997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.50143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.51910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.51913: stderr chunk (state=3): >>><<< 12033 1726867183.51918: stdout chunk (state=3): >>><<< 12033 1726867183.51936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867183.51939: _low_level_execute_command(): starting 12033 1726867183.51942: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/AnsiballZ_command.py && sleep 0' 12033 1726867183.53059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.53207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867183.53291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867183.53322: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.53432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.53436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.53532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.70723: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:19:43.684245", "end": "2024-09-20 17:19:43.704861", "delta": "0:00:00.020616", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867183.72241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.72257: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867183.72313: stderr chunk (state=3): >>><<< 12033 1726867183.72584: stdout chunk (state=3): >>><<< 12033 1726867183.72588: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:19:43.684245", "end": "2024-09-20 17:19:43.704861", "delta": "0:00:00.020616", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.15.68 closed. 12033 1726867183.72593: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867183.72596: _low_level_execute_command(): starting 12033 1726867183.72598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867183.3931186-13083-16070265786991/ > /dev/null 2>&1 && sleep 0' 12033 1726867183.73474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867183.73524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867183.73547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867183.73619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867183.73702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867183.73842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867183.73864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867183.73903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867183.73963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867183.75782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867183.75785: stdout chunk (state=3): >>><<< 12033 1726867183.75794: stderr chunk (state=3): >>><<< 12033 1726867183.75812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867183.75818: handler run complete 12033 1726867183.75842: Evaluated conditional (False): False 12033 1726867183.75853: attempt loop complete, returning result 12033 1726867183.75856: _execute() done 12033 1726867183.75858: dumping result to json 12033 1726867183.75863: done dumping result, returning 12033 1726867183.75871: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcac9-a3a5-74bb-502b-0000000005b7] 12033 1726867183.75875: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b7 12033 1726867183.75982: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b7 12033 1726867183.75985: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020616", "end": "2024-09-20 17:19:43.704861", "rc": 0, "start": "2024-09-20 17:19:43.684245" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 12033 1726867183.76063: no more pending results, returning what we have 12033 1726867183.76067: results queue empty 12033 1726867183.76068: checking for any_errors_fatal 12033 1726867183.76083: done checking for any_errors_fatal 12033 1726867183.76084: checking for max_fail_percentage 12033 1726867183.76088: done checking for max_fail_percentage 12033 1726867183.76089: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.76092: done checking to see if all hosts have failed 12033 1726867183.76093: getting the remaining hosts for this loop 12033 1726867183.76095: done getting the remaining hosts for this loop 12033 1726867183.76099: getting the next task for host managed_node3 12033 1726867183.76107: done getting next task for host 
managed_node3 12033 1726867183.76110: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867183.76116: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867183.76121: getting variables 12033 1726867183.76123: in VariableManager get_vars() 12033 1726867183.76157: Calling all_inventory to load vars for managed_node3 12033 1726867183.76160: Calling groups_inventory to load vars for managed_node3 12033 1726867183.76164: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.76175: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.76300: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.76305: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.79028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.80917: done with get_vars() 12033 1726867183.80945: done getting variables 12033 1726867183.81012: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:43 -0400 (0:00:00.470) 0:00:22.926 ****** 12033 1726867183.81081: entering _queue_task() for managed_node3/set_fact 12033 1726867183.81580: worker is 1 (out of 1 available) 12033 1726867183.81595: exiting _queue_task() for managed_node3/set_fact 12033 1726867183.81608: done queuing things up, now waiting for results queue to drain 12033 1726867183.81610: waiting for pending results... 
12033 1726867183.81968: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12033 1726867183.81974: in run() - task 0affcac9-a3a5-74bb-502b-0000000005b8 12033 1726867183.81979: variable 'ansible_search_path' from source: unknown 12033 1726867183.81982: variable 'ansible_search_path' from source: unknown 12033 1726867183.82183: calling self._execute() 12033 1726867183.82187: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.82190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.82193: variable 'omit' from source: magic vars 12033 1726867183.82734: variable 'ansible_distribution_major_version' from source: facts 12033 1726867183.82746: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867183.83093: variable 'nm_profile_exists' from source: set_fact 12033 1726867183.83097: Evaluated conditional (nm_profile_exists.rc == 0): True 12033 1726867183.83099: variable 'omit' from source: magic vars 12033 1726867183.83102: variable 'omit' from source: magic vars 12033 1726867183.83104: variable 'omit' from source: magic vars 12033 1726867183.83106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867183.83109: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867183.83132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867183.83150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867183.83161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867183.83195: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
12033 1726867183.83198: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.83201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.83335: Set connection var ansible_pipelining to False 12033 1726867183.83344: Set connection var ansible_shell_executable to /bin/sh 12033 1726867183.83353: Set connection var ansible_timeout to 10 12033 1726867183.83356: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867183.83359: Set connection var ansible_connection to ssh 12033 1726867183.83365: Set connection var ansible_shell_type to sh 12033 1726867183.83387: variable 'ansible_shell_executable' from source: unknown 12033 1726867183.83390: variable 'ansible_connection' from source: unknown 12033 1726867183.83392: variable 'ansible_module_compression' from source: unknown 12033 1726867183.83398: variable 'ansible_shell_type' from source: unknown 12033 1726867183.83400: variable 'ansible_shell_executable' from source: unknown 12033 1726867183.83403: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.83407: variable 'ansible_pipelining' from source: unknown 12033 1726867183.83410: variable 'ansible_timeout' from source: unknown 12033 1726867183.83413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.83682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867183.83685: variable 'omit' from source: magic vars 12033 1726867183.83687: starting attempt loop 12033 1726867183.83689: running the handler 12033 1726867183.83690: handler run complete 12033 1726867183.83692: attempt loop complete, returning result 12033 1726867183.83694: _execute() done 
12033 1726867183.83696: dumping result to json 12033 1726867183.83697: done dumping result, returning 12033 1726867183.83699: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-74bb-502b-0000000005b8] 12033 1726867183.83701: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b8 12033 1726867183.83758: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005b8 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12033 1726867183.83819: no more pending results, returning what we have 12033 1726867183.83823: results queue empty 12033 1726867183.83824: checking for any_errors_fatal 12033 1726867183.83832: done checking for any_errors_fatal 12033 1726867183.83833: checking for max_fail_percentage 12033 1726867183.83834: done checking for max_fail_percentage 12033 1726867183.83835: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.83836: done checking to see if all hosts have failed 12033 1726867183.83837: getting the remaining hosts for this loop 12033 1726867183.83839: done getting the remaining hosts for this loop 12033 1726867183.83842: getting the next task for host managed_node3 12033 1726867183.83852: done getting next task for host managed_node3 12033 1726867183.84076: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12033 1726867183.84084: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867183.84089: getting variables 12033 1726867183.84093: in VariableManager get_vars() 12033 1726867183.84119: Calling all_inventory to load vars for managed_node3 12033 1726867183.84121: Calling groups_inventory to load vars for managed_node3 12033 1726867183.84124: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.84133: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.84136: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.84138: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.84690: WORKER PROCESS EXITING 12033 1726867183.86730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.89306: done with get_vars() 12033 1726867183.89330: done getting variables 12033 1726867183.89386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867183.89499: variable 'profile' from source: include params 12033 1726867183.89503: variable 'bond_port_profile' from source: include params 12033 1726867183.89558: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:43 -0400 (0:00:00.085) 0:00:23.011 ****** 12033 1726867183.89592: entering _queue_task() for managed_node3/command 12033 1726867183.89894: worker is 1 (out of 1 available) 12033 1726867183.89907: exiting _queue_task() for managed_node3/command 12033 1726867183.89917: done queuing things up, now waiting for results queue to 
drain 12033 1726867183.90031: waiting for pending results... 12033 1726867183.90201: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 12033 1726867183.90329: in run() - task 0affcac9-a3a5-74bb-502b-0000000005ba 12033 1726867183.90342: variable 'ansible_search_path' from source: unknown 12033 1726867183.90347: variable 'ansible_search_path' from source: unknown 12033 1726867183.90394: calling self._execute() 12033 1726867183.90482: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.90489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.90498: variable 'omit' from source: magic vars 12033 1726867183.90865: variable 'ansible_distribution_major_version' from source: facts 12033 1726867183.90878: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867183.91008: variable 'profile_stat' from source: set_fact 12033 1726867183.91029: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867183.91033: when evaluation is False, skipping this task 12033 1726867183.91035: _execute() done 12033 1726867183.91038: dumping result to json 12033 1726867183.91041: done dumping result, returning 12033 1726867183.91046: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-74bb-502b-0000000005ba] 12033 1726867183.91052: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005ba 12033 1726867183.91144: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005ba 12033 1726867183.91148: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867183.91201: no more pending results, returning what we have 12033 1726867183.91204: results queue empty 12033 1726867183.91206: checking for any_errors_fatal 12033 1726867183.91212: done 
checking for any_errors_fatal 12033 1726867183.91213: checking for max_fail_percentage 12033 1726867183.91215: done checking for max_fail_percentage 12033 1726867183.91216: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.91217: done checking to see if all hosts have failed 12033 1726867183.91218: getting the remaining hosts for this loop 12033 1726867183.91220: done getting the remaining hosts for this loop 12033 1726867183.91225: getting the next task for host managed_node3 12033 1726867183.91235: done getting next task for host managed_node3 12033 1726867183.91238: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12033 1726867183.91246: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867183.91251: getting variables 12033 1726867183.91252: in VariableManager get_vars() 12033 1726867183.91286: Calling all_inventory to load vars for managed_node3 12033 1726867183.91289: Calling groups_inventory to load vars for managed_node3 12033 1726867183.91292: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.91306: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.91310: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.91313: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.92986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.94571: done with get_vars() 12033 1726867183.94596: done getting variables 12033 1726867183.94650: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867183.94756: variable 'profile' from source: include params 12033 1726867183.94760: variable 'bond_port_profile' from source: include params 12033 1726867183.94822: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:43 -0400 (0:00:00.052) 0:00:23.064 ****** 12033 1726867183.94854: entering _queue_task() for managed_node3/set_fact 12033 1726867183.95124: worker is 1 (out of 1 available) 12033 1726867183.95249: exiting _queue_task() for 
managed_node3/set_fact 12033 1726867183.95259: done queuing things up, now waiting for results queue to drain 12033 1726867183.95261: waiting for pending results... 12033 1726867183.95499: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 12033 1726867183.95565: in run() - task 0affcac9-a3a5-74bb-502b-0000000005bb 12033 1726867183.95682: variable 'ansible_search_path' from source: unknown 12033 1726867183.95686: variable 'ansible_search_path' from source: unknown 12033 1726867183.95692: calling self._execute() 12033 1726867183.95716: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867183.95723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867183.95731: variable 'omit' from source: magic vars 12033 1726867183.96114: variable 'ansible_distribution_major_version' from source: facts 12033 1726867183.96131: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867183.96259: variable 'profile_stat' from source: set_fact 12033 1726867183.96272: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867183.96275: when evaluation is False, skipping this task 12033 1726867183.96280: _execute() done 12033 1726867183.96283: dumping result to json 12033 1726867183.96286: done dumping result, returning 12033 1726867183.96294: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-74bb-502b-0000000005bb] 12033 1726867183.96296: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bb skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867183.96427: no more pending results, returning what we have 12033 1726867183.96432: results queue empty 12033 1726867183.96433: checking for any_errors_fatal 12033 1726867183.96556: done checking for any_errors_fatal 
12033 1726867183.96558: checking for max_fail_percentage 12033 1726867183.96560: done checking for max_fail_percentage 12033 1726867183.96561: checking to see if all hosts have failed and the running result is not ok 12033 1726867183.96561: done checking to see if all hosts have failed 12033 1726867183.96562: getting the remaining hosts for this loop 12033 1726867183.96563: done getting the remaining hosts for this loop 12033 1726867183.96567: getting the next task for host managed_node3 12033 1726867183.96574: done getting next task for host managed_node3 12033 1726867183.96576: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12033 1726867183.96584: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867183.96587: getting variables 12033 1726867183.96589: in VariableManager get_vars() 12033 1726867183.96612: Calling all_inventory to load vars for managed_node3 12033 1726867183.96615: Calling groups_inventory to load vars for managed_node3 12033 1726867183.96618: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867183.96629: Calling all_plugins_play to load vars for managed_node3 12033 1726867183.96632: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867183.96636: Calling groups_plugins_play to load vars for managed_node3 12033 1726867183.97154: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bb 12033 1726867183.97158: WORKER PROCESS EXITING 12033 1726867183.97952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867183.99507: done with get_vars() 12033 1726867183.99526: done getting variables 12033 1726867183.99583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867183.99689: variable 'profile' from source: include params 12033 1726867183.99696: variable 'bond_port_profile' from source: include params 12033 1726867183.99750: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:43 -0400 (0:00:00.049) 0:00:23.113 ****** 12033 1726867183.99784: entering _queue_task() for managed_node3/command 12033 
1726867184.00057: worker is 1 (out of 1 available) 12033 1726867184.00070: exiting _queue_task() for managed_node3/command 12033 1726867184.00085: done queuing things up, now waiting for results queue to drain 12033 1726867184.00087: waiting for pending results... 12033 1726867184.00498: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 12033 1726867184.00505: in run() - task 0affcac9-a3a5-74bb-502b-0000000005bc 12033 1726867184.00524: variable 'ansible_search_path' from source: unknown 12033 1726867184.00532: variable 'ansible_search_path' from source: unknown 12033 1726867184.00568: calling self._execute() 12033 1726867184.00665: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.00689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.00709: variable 'omit' from source: magic vars 12033 1726867184.01074: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.01096: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.01223: variable 'profile_stat' from source: set_fact 12033 1726867184.01244: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867184.01252: when evaluation is False, skipping this task 12033 1726867184.01260: _execute() done 12033 1726867184.01266: dumping result to json 12033 1726867184.01273: done dumping result, returning 12033 1726867184.01287: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-74bb-502b-0000000005bc] 12033 1726867184.01302: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bc 12033 1726867184.01584: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bc 12033 1726867184.01587: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 
12033 1726867184.01631: no more pending results, returning what we have 12033 1726867184.01635: results queue empty 12033 1726867184.01637: checking for any_errors_fatal 12033 1726867184.01642: done checking for any_errors_fatal 12033 1726867184.01642: checking for max_fail_percentage 12033 1726867184.01644: done checking for max_fail_percentage 12033 1726867184.01645: checking to see if all hosts have failed and the running result is not ok 12033 1726867184.01646: done checking to see if all hosts have failed 12033 1726867184.01647: getting the remaining hosts for this loop 12033 1726867184.01648: done getting the remaining hosts for this loop 12033 1726867184.01651: getting the next task for host managed_node3 12033 1726867184.01658: done getting next task for host managed_node3 12033 1726867184.01661: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12033 1726867184.01667: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867184.01671: getting variables 12033 1726867184.01672: in VariableManager get_vars() 12033 1726867184.01701: Calling all_inventory to load vars for managed_node3 12033 1726867184.01704: Calling groups_inventory to load vars for managed_node3 12033 1726867184.01708: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867184.01718: Calling all_plugins_play to load vars for managed_node3 12033 1726867184.01721: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867184.01724: Calling groups_plugins_play to load vars for managed_node3 12033 1726867184.03222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867184.05806: done with get_vars() 12033 1726867184.05854: done getting variables 12033 1726867184.05916: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867184.06020: variable 'profile' from source: include params 12033 1726867184.06024: variable 'bond_port_profile' from source: include params 12033 1726867184.06087: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:44 -0400 (0:00:00.063) 0:00:23.176 ****** 
12033 1726867184.06120: entering _queue_task() for managed_node3/set_fact 12033 1726867184.06601: worker is 1 (out of 1 available) 12033 1726867184.06611: exiting _queue_task() for managed_node3/set_fact 12033 1726867184.06621: done queuing things up, now waiting for results queue to drain 12033 1726867184.06623: waiting for pending results... 12033 1726867184.06783: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 12033 1726867184.06931: in run() - task 0affcac9-a3a5-74bb-502b-0000000005bd 12033 1726867184.06958: variable 'ansible_search_path' from source: unknown 12033 1726867184.06968: variable 'ansible_search_path' from source: unknown 12033 1726867184.07101: calling self._execute() 12033 1726867184.07373: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.07483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.07487: variable 'omit' from source: magic vars 12033 1726867184.07809: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.07828: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.07955: variable 'profile_stat' from source: set_fact 12033 1726867184.07973: Evaluated conditional (profile_stat.stat.exists): False 12033 1726867184.07984: when evaluation is False, skipping this task 12033 1726867184.07996: _execute() done 12033 1726867184.08003: dumping result to json 12033 1726867184.08011: done dumping result, returning 12033 1726867184.08027: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-74bb-502b-0000000005bd] 12033 1726867184.08040: sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bd skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12033 1726867184.08193: no more pending results, returning 
what we have 12033 1726867184.08197: results queue empty 12033 1726867184.08199: checking for any_errors_fatal 12033 1726867184.08205: done checking for any_errors_fatal 12033 1726867184.08206: checking for max_fail_percentage 12033 1726867184.08208: done checking for max_fail_percentage 12033 1726867184.08209: checking to see if all hosts have failed and the running result is not ok 12033 1726867184.08210: done checking to see if all hosts have failed 12033 1726867184.08211: getting the remaining hosts for this loop 12033 1726867184.08212: done getting the remaining hosts for this loop 12033 1726867184.08216: getting the next task for host managed_node3 12033 1726867184.08234: done getting next task for host managed_node3 12033 1726867184.08237: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12033 1726867184.08244: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867184.08248: getting variables 12033 1726867184.08250: in VariableManager get_vars() 12033 1726867184.08282: Calling all_inventory to load vars for managed_node3 12033 1726867184.08285: Calling groups_inventory to load vars for managed_node3 12033 1726867184.08289: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867184.08304: Calling all_plugins_play to load vars for managed_node3 12033 1726867184.08307: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867184.08310: Calling groups_plugins_play to load vars for managed_node3 12033 1726867184.09193: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000005bd 12033 1726867184.09197: WORKER PROCESS EXITING 12033 1726867184.10440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867184.12150: done with get_vars() 12033 1726867184.12170: done getting variables 12033 1726867184.12233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867184.12346: variable 'profile' from source: include params 12033 1726867184.12350: variable 'bond_port_profile' from source: include params 12033 1726867184.12411: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:44 -0400 (0:00:00.063) 0:00:23.240 ****** 12033 1726867184.12445: entering _queue_task() for managed_node3/assert 12033 1726867184.12741: worker is 1 (out of 1 available) 12033 1726867184.12753: exiting _queue_task() for 
managed_node3/assert 12033 1726867184.12764: done queuing things up, now waiting for results queue to drain 12033 1726867184.12765: waiting for pending results... 12033 1726867184.13050: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 12033 1726867184.13173: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e8 12033 1726867184.13201: variable 'ansible_search_path' from source: unknown 12033 1726867184.13211: variable 'ansible_search_path' from source: unknown 12033 1726867184.13250: calling self._execute() 12033 1726867184.13347: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.13358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.13370: variable 'omit' from source: magic vars 12033 1726867184.13724: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.13740: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.13755: variable 'omit' from source: magic vars 12033 1726867184.13813: variable 'omit' from source: magic vars 12033 1726867184.13924: variable 'profile' from source: include params 12033 1726867184.13934: variable 'bond_port_profile' from source: include params 12033 1726867184.14009: variable 'bond_port_profile' from source: include params 12033 1726867184.14033: variable 'omit' from source: magic vars 12033 1726867184.14076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867184.14121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867184.14146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867184.14168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.14185: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.14226: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867184.14236: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.14243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.14363: Set connection var ansible_pipelining to False 12033 1726867184.14379: Set connection var ansible_shell_executable to /bin/sh 12033 1726867184.14412: Set connection var ansible_timeout to 10 12033 1726867184.14415: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867184.14417: Set connection var ansible_connection to ssh 12033 1726867184.14420: Set connection var ansible_shell_type to sh 12033 1726867184.14443: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.14522: variable 'ansible_connection' from source: unknown 12033 1726867184.14525: variable 'ansible_module_compression' from source: unknown 12033 1726867184.14527: variable 'ansible_shell_type' from source: unknown 12033 1726867184.14529: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.14531: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.14533: variable 'ansible_pipelining' from source: unknown 12033 1726867184.14535: variable 'ansible_timeout' from source: unknown 12033 1726867184.14537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.14646: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867184.14665: variable 'omit' from source: magic vars 12033 1726867184.14675: starting 
attempt loop 12033 1726867184.14685: running the handler 12033 1726867184.14803: variable 'lsr_net_profile_exists' from source: set_fact 12033 1726867184.14813: Evaluated conditional (lsr_net_profile_exists): True 12033 1726867184.14823: handler run complete 12033 1726867184.14846: attempt loop complete, returning result 12033 1726867184.14858: _execute() done 12033 1726867184.14963: dumping result to json 12033 1726867184.14967: done dumping result, returning 12033 1726867184.14969: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0affcac9-a3a5-74bb-502b-0000000004e8] 12033 1726867184.14971: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e8 12033 1726867184.15039: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e8 12033 1726867184.15042: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867184.15115: no more pending results, returning what we have 12033 1726867184.15119: results queue empty 12033 1726867184.15120: checking for any_errors_fatal 12033 1726867184.15127: done checking for any_errors_fatal 12033 1726867184.15128: checking for max_fail_percentage 12033 1726867184.15130: done checking for max_fail_percentage 12033 1726867184.15132: checking to see if all hosts have failed and the running result is not ok 12033 1726867184.15133: done checking to see if all hosts have failed 12033 1726867184.15133: getting the remaining hosts for this loop 12033 1726867184.15136: done getting the remaining hosts for this loop 12033 1726867184.15139: getting the next task for host managed_node3 12033 1726867184.15147: done getting next task for host managed_node3 12033 1726867184.15151: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12033 1726867184.15156: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867184.15160: getting variables 12033 1726867184.15161: in VariableManager get_vars() 12033 1726867184.15196: Calling all_inventory to load vars for managed_node3 12033 1726867184.15199: Calling groups_inventory to load vars for managed_node3 12033 1726867184.15202: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867184.15213: Calling all_plugins_play to load vars for managed_node3 12033 1726867184.15216: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867184.15219: Calling groups_plugins_play to load vars for managed_node3 12033 1726867184.17942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867184.20349: done with get_vars() 12033 1726867184.20375: done getting variables 12033 1726867184.20437: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867184.20561: variable 'profile' from source: include params 12033 1726867184.20565: variable 'bond_port_profile' from source: include params 12033 1726867184.20628: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:44 -0400 (0:00:00.082) 0:00:23.322 ****** 12033 1726867184.20663: entering _queue_task() for managed_node3/assert 12033 1726867184.21002: worker is 1 (out of 1 available) 12033 1726867184.21014: exiting _queue_task() for managed_node3/assert 12033 1726867184.21027: done queuing things up, now waiting for results queue to drain 12033 1726867184.21028: waiting for 
pending results... 12033 1726867184.21316: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 12033 1726867184.21485: in run() - task 0affcac9-a3a5-74bb-502b-0000000004e9 12033 1726867184.21492: variable 'ansible_search_path' from source: unknown 12033 1726867184.21495: variable 'ansible_search_path' from source: unknown 12033 1726867184.21518: calling self._execute() 12033 1726867184.21616: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.21683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.21687: variable 'omit' from source: magic vars 12033 1726867184.22003: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.22019: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.22028: variable 'omit' from source: magic vars 12033 1726867184.22085: variable 'omit' from source: magic vars 12033 1726867184.22484: variable 'profile' from source: include params 12033 1726867184.22488: variable 'bond_port_profile' from source: include params 12033 1726867184.22493: variable 'bond_port_profile' from source: include params 12033 1726867184.22500: variable 'omit' from source: magic vars 12033 1726867184.22545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867184.22633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867184.22722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867184.22744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.22823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 
1726867184.22863: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867184.22924: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.22933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.23153: Set connection var ansible_pipelining to False 12033 1726867184.23167: Set connection var ansible_shell_executable to /bin/sh 12033 1726867184.23181: Set connection var ansible_timeout to 10 12033 1726867184.23193: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867184.23201: Set connection var ansible_connection to ssh 12033 1726867184.23281: Set connection var ansible_shell_type to sh 12033 1726867184.23285: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.23461: variable 'ansible_connection' from source: unknown 12033 1726867184.23465: variable 'ansible_module_compression' from source: unknown 12033 1726867184.23467: variable 'ansible_shell_type' from source: unknown 12033 1726867184.23469: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.23471: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.23473: variable 'ansible_pipelining' from source: unknown 12033 1726867184.23476: variable 'ansible_timeout' from source: unknown 12033 1726867184.23480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.23631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867184.23698: variable 'omit' from source: magic vars 12033 1726867184.23896: starting attempt loop 12033 1726867184.23900: running the handler 12033 1726867184.24000: variable 
'lsr_net_profile_ansible_managed' from source: set_fact 12033 1726867184.24082: Evaluated conditional (lsr_net_profile_ansible_managed): True 12033 1726867184.24085: handler run complete 12033 1726867184.24088: attempt loop complete, returning result 12033 1726867184.24092: _execute() done 12033 1726867184.24094: dumping result to json 12033 1726867184.24097: done dumping result, returning 12033 1726867184.24099: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcac9-a3a5-74bb-502b-0000000004e9] 12033 1726867184.24101: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e9 12033 1726867184.24403: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004e9 12033 1726867184.24407: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867184.24487: no more pending results, returning what we have 12033 1726867184.24493: results queue empty 12033 1726867184.24494: checking for any_errors_fatal 12033 1726867184.24503: done checking for any_errors_fatal 12033 1726867184.24504: checking for max_fail_percentage 12033 1726867184.24506: done checking for max_fail_percentage 12033 1726867184.24507: checking to see if all hosts have failed and the running result is not ok 12033 1726867184.24508: done checking to see if all hosts have failed 12033 1726867184.24509: getting the remaining hosts for this loop 12033 1726867184.24511: done getting the remaining hosts for this loop 12033 1726867184.24514: getting the next task for host managed_node3 12033 1726867184.24523: done getting next task for host managed_node3 12033 1726867184.24525: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12033 1726867184.24530: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867184.24534: getting variables 12033 1726867184.24536: in VariableManager get_vars() 12033 1726867184.24568: Calling all_inventory to load vars for managed_node3 12033 1726867184.24572: Calling groups_inventory to load vars for managed_node3 12033 1726867184.24575: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867184.24588: Calling all_plugins_play to load vars for managed_node3 12033 1726867184.24595: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867184.24599: Calling groups_plugins_play to load vars for managed_node3 12033 1726867184.27137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867184.28841: done with get_vars() 12033 1726867184.28861: done getting variables 12033 1726867184.28920: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867184.29036: variable 'profile' from source: include params 12033 1726867184.29040: variable 'bond_port_profile' from source: include params 12033 1726867184.29101: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:44 -0400 (0:00:00.084) 0:00:23.407 ****** 12033 1726867184.29135: entering _queue_task() for managed_node3/assert 12033 1726867184.29451: worker is 1 (out of 1 available) 12033 1726867184.29464: exiting _queue_task() for managed_node3/assert 12033 1726867184.29475: done queuing things up, now waiting for results queue to drain 12033 1726867184.29476: waiting for pending results... 
12033 1726867184.29934: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 12033 1726867184.30199: in run() - task 0affcac9-a3a5-74bb-502b-0000000004ea 12033 1726867184.30202: variable 'ansible_search_path' from source: unknown 12033 1726867184.30205: variable 'ansible_search_path' from source: unknown 12033 1726867184.30207: calling self._execute() 12033 1726867184.30248: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.30259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.30271: variable 'omit' from source: magic vars 12033 1726867184.30639: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.30659: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.30669: variable 'omit' from source: magic vars 12033 1726867184.30729: variable 'omit' from source: magic vars 12033 1726867184.30852: variable 'profile' from source: include params 12033 1726867184.30869: variable 'bond_port_profile' from source: include params 12033 1726867184.30959: variable 'bond_port_profile' from source: include params 12033 1726867184.30994: variable 'omit' from source: magic vars 12033 1726867184.31035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867184.31194: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867184.31198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867184.31200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.31202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.31204: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867184.31206: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.31208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.31355: Set connection var ansible_pipelining to False 12033 1726867184.31369: Set connection var ansible_shell_executable to /bin/sh 12033 1726867184.31383: Set connection var ansible_timeout to 10 12033 1726867184.31395: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867184.31401: Set connection var ansible_connection to ssh 12033 1726867184.31410: Set connection var ansible_shell_type to sh 12033 1726867184.31444: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.31452: variable 'ansible_connection' from source: unknown 12033 1726867184.31459: variable 'ansible_module_compression' from source: unknown 12033 1726867184.31528: variable 'ansible_shell_type' from source: unknown 12033 1726867184.31531: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.31533: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.31535: variable 'ansible_pipelining' from source: unknown 12033 1726867184.31538: variable 'ansible_timeout' from source: unknown 12033 1726867184.31540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.31639: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867184.31657: variable 'omit' from source: magic vars 12033 1726867184.31668: starting attempt loop 12033 1726867184.31674: running the handler 12033 1726867184.31787: variable 'lsr_net_profile_fingerprint' from source: set_fact 
12033 1726867184.31801: Evaluated conditional (lsr_net_profile_fingerprint): True 12033 1726867184.31812: handler run complete 12033 1726867184.31830: attempt loop complete, returning result 12033 1726867184.31837: _execute() done 12033 1726867184.31853: dumping result to json 12033 1726867184.31856: done dumping result, returning 12033 1726867184.31862: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcac9-a3a5-74bb-502b-0000000004ea] 12033 1726867184.31963: sending task result for task 0affcac9-a3a5-74bb-502b-0000000004ea 12033 1726867184.32031: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000004ea 12033 1726867184.32034: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867184.32115: no more pending results, returning what we have 12033 1726867184.32119: results queue empty 12033 1726867184.32120: checking for any_errors_fatal 12033 1726867184.32130: done checking for any_errors_fatal 12033 1726867184.32131: checking for max_fail_percentage 12033 1726867184.32133: done checking for max_fail_percentage 12033 1726867184.32134: checking to see if all hosts have failed and the running result is not ok 12033 1726867184.32135: done checking to see if all hosts have failed 12033 1726867184.32136: getting the remaining hosts for this loop 12033 1726867184.32137: done getting the remaining hosts for this loop 12033 1726867184.32140: getting the next task for host managed_node3 12033 1726867184.32152: done getting next task for host managed_node3 12033 1726867184.32154: ^ task is: TASK: ** TEST check bond settings 12033 1726867184.32158: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867184.32162: getting variables 12033 1726867184.32164: in VariableManager get_vars() 12033 1726867184.32396: Calling all_inventory to load vars for managed_node3 12033 1726867184.32399: Calling groups_inventory to load vars for managed_node3 12033 1726867184.32403: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867184.32411: Calling all_plugins_play to load vars for managed_node3 12033 1726867184.32414: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867184.32417: Calling groups_plugins_play to load vars for managed_node3 12033 1726867184.34070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867184.35952: done with get_vars() 12033 1726867184.35974: done getting variables 12033 1726867184.36074: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 17:19:44 -0400 (0:00:00.069) 0:00:23.476 ****** 12033 1726867184.36123: entering _queue_task() for managed_node3/command 12033 1726867184.36513: worker is 1 (out of 1 available) 12033 1726867184.36525: exiting _queue_task() for managed_node3/command 12033 1726867184.36538: done queuing things up, now waiting for results queue to drain 12033 1726867184.36539: waiting for pending results... 12033 1726867184.36859: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 12033 1726867184.37184: in run() - task 0affcac9-a3a5-74bb-502b-000000000400 12033 1726867184.37188: variable 'ansible_search_path' from source: unknown 12033 1726867184.37194: variable 'ansible_search_path' from source: unknown 12033 1726867184.37197: variable 'bond_options_to_assert' from source: play vars 12033 1726867184.37335: variable 'bond_options_to_assert' from source: play vars 12033 1726867184.37556: variable 'omit' from source: magic vars 12033 1726867184.37949: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.38182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.38186: variable 'omit' from source: magic vars 12033 1726867184.38715: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.38731: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.38742: variable 'omit' from source: magic vars 12033 1726867184.39284: variable 'omit' from source: magic vars 12033 1726867184.39408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867184.43258: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867184.43360: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867184.43463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867184.43510: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867184.43623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867184.43946: variable 'controller_device' from source: play vars 12033 1726867184.43959: variable 'bond_opt' from source: unknown 12033 1726867184.44020: variable 'omit' from source: magic vars 12033 1726867184.44228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867184.44232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867184.44234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867184.44237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.44446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.44449: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867184.44452: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.44454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.44604: Set connection var ansible_pipelining to False 12033 1726867184.44619: Set connection var ansible_shell_executable to /bin/sh 12033 1726867184.44673: Set connection var ansible_timeout to 10 12033 1726867184.44687: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867184.44697: Set connection var 
ansible_connection to ssh 12033 1726867184.44707: Set connection var ansible_shell_type to sh 12033 1726867184.44736: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.44780: variable 'ansible_connection' from source: unknown 12033 1726867184.44811: variable 'ansible_module_compression' from source: unknown 12033 1726867184.44834: variable 'ansible_shell_type' from source: unknown 12033 1726867184.44889: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.44900: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.44908: variable 'ansible_pipelining' from source: unknown 12033 1726867184.44915: variable 'ansible_timeout' from source: unknown 12033 1726867184.44934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.45158: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867184.45244: variable 'omit' from source: magic vars 12033 1726867184.45273: starting attempt loop 12033 1726867184.45282: running the handler 12033 1726867184.45306: _low_level_execute_command(): starting 12033 1726867184.45321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867184.46034: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867184.46055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.46083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.46179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 
1726867184.46202: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.46233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.46410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.48002: stdout chunk (state=3): >>>/root <<< 12033 1726867184.48217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.48229: stdout chunk (state=3): >>><<< 12033 1726867184.48243: stderr chunk (state=3): >>><<< 12033 1726867184.48274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.48575: _low_level_execute_command(): starting 12033 1726867184.48583: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572 `" && echo ansible-tmp-1726867184.4848406-13156-256969393543572="` echo /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572 `" ) && sleep 0' 12033 1726867184.49731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867184.49794: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.49852: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.49865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.49950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.50024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.51933: stdout chunk (state=3): >>>ansible-tmp-1726867184.4848406-13156-256969393543572=/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572 <<< 12033 1726867184.52076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.52082: stdout chunk (state=3): >>><<< 12033 1726867184.52093: stderr chunk (state=3): >>><<< 12033 1726867184.52225: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867184.4848406-13156-256969393543572=/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.52257: variable 'ansible_module_compression' from source: unknown 12033 1726867184.52307: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867184.52482: variable 'ansible_facts' from source: unknown 12033 1726867184.52486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py 12033 1726867184.52840: Sending initial data 12033 1726867184.52844: Sent initial data (156 bytes) 12033 1726867184.54086: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.54126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.54130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.54196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 12033 1726867184.54202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.55744: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867184.55753: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867184.55798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867184.55928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpmgr9_dt6 /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py <<< 12033 1726867184.55937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py" <<< 12033 1726867184.55973: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpmgr9_dt6" to remote "/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py" <<< 12033 1726867184.57239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.57307: stderr chunk (state=3): >>><<< 12033 1726867184.57310: stdout chunk (state=3): >>><<< 12033 1726867184.57338: done transferring module to remote 12033 1726867184.57349: _low_level_execute_command(): starting 12033 1726867184.57352: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/ /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py && sleep 0' 12033 1726867184.58583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.58594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.58597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867184.58601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.58603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.58753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.58757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.58774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.58948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.60751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.60755: stderr chunk (state=3): >>><<< 12033 1726867184.60758: stdout chunk (state=3): >>><<< 12033 1726867184.60776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.60782: _low_level_execute_command(): starting 12033 1726867184.60812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/AnsiballZ_command.py && sleep 0' 12033 1726867184.62271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.62274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.62278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867184.62281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.62283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867184.62285: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.62304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.62318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.62388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.77873: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 17:19:44.772699", "end": "2024-09-20 17:19:44.775625", "delta": "0:00:00.002926", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867184.79274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867184.79481: stdout chunk (state=3): >>><<< 12033 1726867184.79485: stderr chunk (state=3): >>><<< 12033 1726867184.79488: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 17:19:44.772699", "end": "2024-09-20 17:19:44.775625", "delta": "0:00:00.002926", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867184.79500: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867184.79508: _low_level_execute_command(): starting 12033 1726867184.79510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867184.4848406-13156-256969393543572/ > /dev/null 2>&1 && sleep 0' 12033 1726867184.80542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867184.80656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.80754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867184.80757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.80873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.82985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.82989: stdout chunk (state=3): >>><<< 12033 1726867184.82994: stderr chunk (state=3): >>><<< 12033 1726867184.82997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.82999: handler run complete 12033 
1726867184.83001: Evaluated conditional (False): False 12033 1726867184.83189: variable 'bond_opt' from source: unknown 12033 1726867184.83314: variable 'result' from source: unknown 12033 1726867184.83318: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867184.83332: attempt loop complete, returning result 12033 1726867184.83353: variable 'bond_opt' from source: unknown 12033 1726867184.83468: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.002926", "end": "2024-09-20 17:19:44.775625", "rc": 0, "start": "2024-09-20 17:19:44.772699" } STDOUT: 802.3ad 4 12033 1726867184.84083: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.84087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.84092: variable 'omit' from source: magic vars 12033 1726867184.84197: variable 'ansible_distribution_major_version' from source: facts 12033 1726867184.84384: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867184.84387: variable 'omit' from source: magic vars 12033 1726867184.84392: variable 'omit' from source: magic vars 12033 1726867184.84659: variable 'controller_device' from source: play vars 12033 1726867184.84670: variable 'bond_opt' from source: unknown 12033 1726867184.84783: variable 'omit' from source: magic vars 12033 1726867184.84861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867184.84865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.84868: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867184.84871: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867184.84873: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.84875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.85036: Set connection var ansible_pipelining to False 12033 1726867184.85110: Set connection var ansible_shell_executable to /bin/sh 12033 1726867184.85123: Set connection var ansible_timeout to 10 12033 1726867184.85133: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867184.85139: Set connection var ansible_connection to ssh 12033 1726867184.85212: Set connection var ansible_shell_type to sh 12033 1726867184.85643: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.85646: variable 'ansible_connection' from source: unknown 12033 1726867184.85649: variable 'ansible_module_compression' from source: unknown 12033 1726867184.85651: variable 'ansible_shell_type' from source: unknown 12033 1726867184.85653: variable 'ansible_shell_executable' from source: unknown 12033 1726867184.85655: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867184.85657: variable 'ansible_pipelining' from source: unknown 12033 1726867184.85659: variable 'ansible_timeout' from source: unknown 12033 1726867184.85661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867184.85682: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867184.85783: variable 'omit' from source: magic vars 12033 1726867184.85786: starting 
attempt loop 12033 1726867184.85788: running the handler 12033 1726867184.85793: _low_level_execute_command(): starting 12033 1726867184.85796: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867184.86999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.87108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867184.87126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867184.87214: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.87342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.87413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.87494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.89072: stdout chunk (state=3): >>>/root <<< 12033 1726867184.89176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.89375: stdout chunk (state=3): >>><<< 12033 1726867184.89386: stderr chunk (state=3): >>><<< 12033 
1726867184.89389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.89396: _low_level_execute_command(): starting 12033 1726867184.89398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608 `" && echo ansible-tmp-1726867184.8928719-13156-137074914388608="` echo /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608 `" ) && sleep 0' 12033 1726867184.90623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.90771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.90833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.90875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.92764: stdout chunk (state=3): >>>ansible-tmp-1726867184.8928719-13156-137074914388608=/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608 <<< 12033 1726867184.92878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.92898: stderr chunk (state=3): >>><<< 12033 1726867184.92932: stdout chunk (state=3): >>><<< 12033 1726867184.92959: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867184.8928719-13156-137074914388608=/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.93168: variable 'ansible_module_compression' from source: unknown 12033 1726867184.93170: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867184.93173: variable 'ansible_facts' from source: unknown 12033 1726867184.93308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py 12033 1726867184.93801: Sending initial data 12033 1726867184.93804: Sent initial data (156 bytes) 12033 1726867184.94274: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867184.94290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.94307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.94325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867184.94372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.94481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.94506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.94537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.94582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.96116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867184.96180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867184.96236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpy54gwjyo /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py <<< 12033 1726867184.96240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py" <<< 12033 1726867184.96292: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpy54gwjyo" to remote "/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py" <<< 12033 1726867184.97135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.97138: stdout chunk (state=3): >>><<< 12033 1726867184.97140: stderr chunk (state=3): >>><<< 12033 1726867184.97142: done transferring module to remote 12033 1726867184.97144: _low_level_execute_command(): starting 12033 1726867184.97147: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/ /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py && sleep 0' 12033 1726867184.97887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867184.97894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.97898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.97900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867184.97903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 
12033 1726867184.97905: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867184.97907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867184.97909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867184.97911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867184.97912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867184.97914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867184.97916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867184.97971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867184.97998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867184.98003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867184.98070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867184.99840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867184.99845: stdout chunk (state=3): >>><<< 12033 1726867184.99847: stderr chunk (state=3): >>><<< 12033 1726867184.99864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867184.99868: _low_level_execute_command(): starting 12033 1726867184.99870: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/AnsiballZ_command.py && sleep 0' 12033 1726867185.00396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.00399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867185.00412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.00483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.00486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867185.00489: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867185.00494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.00497: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 12033 1726867185.00499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867185.00501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867185.00503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867185.00506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.00510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.00513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867185.00515: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867185.00523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.00617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.00620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.00671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.00774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.16266: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 17:19:45.157064", "end": "2024-09-20 17:19:45.160297", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 
1726867185.17950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867185.17959: stdout chunk (state=3): >>><<< 12033 1726867185.17983: stderr chunk (state=3): >>><<< 12033 1726867185.18167: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 17:19:45.157064", "end": "2024-09-20 17:19:45.160297", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867185.18171: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867185.18173: _low_level_execute_command(): starting 12033 1726867185.18175: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867184.8928719-13156-137074914388608/ > /dev/null 2>&1 && sleep 0' 12033 1726867185.19694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.19717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.19748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.19793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.21653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.21674: stderr chunk (state=3): >>><<< 12033 1726867185.21686: stdout chunk (state=3): >>><<< 12033 1726867185.21862: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.21865: 
handler run complete 12033 1726867185.21867: Evaluated conditional (False): False 12033 1726867185.22056: variable 'bond_opt' from source: unknown 12033 1726867185.22294: variable 'result' from source: unknown 12033 1726867185.22297: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867185.22299: attempt loop complete, returning result 12033 1726867185.22301: variable 'bond_opt' from source: unknown 12033 1726867185.22413: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003233", "end": "2024-09-20 17:19:45.160297", "rc": 0, "start": "2024-09-20 17:19:45.157064" } STDOUT: 65535 12033 1726867185.23085: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.23088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.23091: variable 'omit' from source: magic vars 12033 1726867185.23109: variable 'ansible_distribution_major_version' from source: facts 12033 1726867185.23120: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867185.23128: variable 'omit' from source: magic vars 12033 1726867185.23146: variable 'omit' from source: magic vars 12033 1726867185.23440: variable 'controller_device' from source: play vars 12033 1726867185.23449: variable 'bond_opt' from source: unknown 12033 1726867185.23470: variable 'omit' from source: magic vars 12033 1726867185.23542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867185.23557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
12033 1726867185.23737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867185.23745: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867185.23747: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.23749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.23751: Set connection var ansible_pipelining to False 12033 1726867185.23856: Set connection var ansible_shell_executable to /bin/sh 12033 1726867185.23868: Set connection var ansible_timeout to 10 12033 1726867185.23879: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867185.23886: Set connection var ansible_connection to ssh 12033 1726867185.23896: Set connection var ansible_shell_type to sh 12033 1726867185.23922: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.23960: variable 'ansible_connection' from source: unknown 12033 1726867185.23988: variable 'ansible_module_compression' from source: unknown 12033 1726867185.23996: variable 'ansible_shell_type' from source: unknown 12033 1726867185.24003: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.24009: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.24172: variable 'ansible_pipelining' from source: unknown 12033 1726867185.24175: variable 'ansible_timeout' from source: unknown 12033 1726867185.24180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.24182: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867185.24288: variable 'omit' from 
source: magic vars 12033 1726867185.24297: starting attempt loop 12033 1726867185.24304: running the handler 12033 1726867185.24316: _low_level_execute_command(): starting 12033 1726867185.24324: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867185.25562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.25694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.25775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.25871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.25898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.25971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.27599: stdout chunk (state=3): >>>/root <<< 12033 1726867185.27752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.27755: stdout chunk (state=3): >>><<< 12033 1726867185.27757: stderr chunk (state=3): >>><<< 12033 
1726867185.27772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.27892: _low_level_execute_command(): starting 12033 1726867185.27896: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796 `" && echo ansible-tmp-1726867185.2781618-13156-248478153165796="` echo /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796 `" ) && sleep 0' 12033 1726867185.29121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.29154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.29181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.29241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.29362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.31240: stdout chunk (state=3): >>>ansible-tmp-1726867185.2781618-13156-248478153165796=/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796 <<< 12033 1726867185.31408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.31411: stdout chunk (state=3): >>><<< 12033 1726867185.31414: stderr chunk (state=3): >>><<< 12033 1726867185.31430: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867185.2781618-13156-248478153165796=/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.31584: variable 'ansible_module_compression' from source: unknown 12033 1726867185.31586: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867185.31604: variable 'ansible_facts' from source: unknown 12033 1726867185.31843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py 12033 1726867185.32048: Sending initial data 12033 1726867185.32058: Sent initial data (156 bytes) 12033 1726867185.33358: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.33512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.33550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.33597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.35118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867185.35158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867185.35205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp891zorhc /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py <<< 12033 1726867185.35208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py" <<< 12033 1726867185.35326: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp891zorhc" to remote "/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py" <<< 12033 1726867185.36887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.36891: stdout chunk (state=3): >>><<< 12033 1726867185.36893: stderr chunk (state=3): >>><<< 12033 1726867185.36895: done transferring module to remote 12033 1726867185.36897: _low_level_execute_command(): starting 12033 1726867185.37025: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/ /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py && sleep 0' 12033 1726867185.37867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.37902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.37916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867185.38008: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.38023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.38037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.38104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.39973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.39998: stderr chunk (state=3): >>><<< 12033 1726867185.40008: stdout chunk (state=3): >>><<< 12033 1726867185.40030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.40040: _low_level_execute_command(): starting 12033 1726867185.40048: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/AnsiballZ_command.py && sleep 0' 12033 1726867185.40797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.40845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.40922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.40947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.40967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12033 1726867185.41084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.56372: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 17:19:45.558319", "end": "2024-09-20 17:19:45.561365", "delta": "0:00:00.003046", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867185.57884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867185.57888: stdout chunk (state=3): >>><<< 12033 1726867185.57890: stderr chunk (state=3): >>><<< 12033 1726867185.57898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 17:19:45.558319", "end": "2024-09-20 17:19:45.561365", "delta": "0:00:00.003046", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867185.57925: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867185.57928: _low_level_execute_command(): starting 12033 1726867185.57930: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867185.2781618-13156-248478153165796/ > /dev/null 2>&1 && sleep 0' 12033 1726867185.58582: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.58590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.58593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.58595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.58706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.58711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.60620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.60623: stderr chunk (state=3): >>><<< 12033 1726867185.60626: stdout chunk (state=3): >>><<< 12033 1726867185.60646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.60650: handler run complete 12033 1726867185.60792: Evaluated conditional (False): False 12033 1726867185.60862: variable 'bond_opt' from source: unknown 12033 1726867185.60871: variable 'result' from source: unknown 12033 1726867185.60893: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867185.60907: attempt loop complete, returning result 12033 1726867185.60926: variable 'bond_opt' from source: unknown 12033 1726867185.61062: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:00.003046", "end": "2024-09-20 17:19:45.561365", "rc": 0, "start": "2024-09-20 17:19:45.558319" } STDOUT: 00:00:5e:00:53:5d 12033 1726867185.61251: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.61254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 
1726867185.61256: variable 'omit' from source: magic vars 12033 1726867185.61408: variable 'ansible_distribution_major_version' from source: facts 12033 1726867185.61411: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867185.61416: variable 'omit' from source: magic vars 12033 1726867185.61428: variable 'omit' from source: magic vars 12033 1726867185.61544: variable 'controller_device' from source: play vars 12033 1726867185.61560: variable 'bond_opt' from source: unknown 12033 1726867185.61583: variable 'omit' from source: magic vars 12033 1726867185.61601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867185.61608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867185.61614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867185.61623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867185.61626: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.61628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.61683: Set connection var ansible_pipelining to False 12033 1726867185.61689: Set connection var ansible_shell_executable to /bin/sh 12033 1726867185.61700: Set connection var ansible_timeout to 10 12033 1726867185.61702: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867185.61705: Set connection var ansible_connection to ssh 12033 1726867185.61710: Set connection var ansible_shell_type to sh 12033 1726867185.61728: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.61731: variable 'ansible_connection' from source: unknown 12033 1726867185.61733: variable 
'ansible_module_compression' from source: unknown 12033 1726867185.61735: variable 'ansible_shell_type' from source: unknown 12033 1726867185.61738: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.61740: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.61742: variable 'ansible_pipelining' from source: unknown 12033 1726867185.61744: variable 'ansible_timeout' from source: unknown 12033 1726867185.61752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.61837: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867185.61844: variable 'omit' from source: magic vars 12033 1726867185.61847: starting attempt loop 12033 1726867185.61849: running the handler 12033 1726867185.61856: _low_level_execute_command(): starting 12033 1726867185.61858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867185.62405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867185.62409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.62411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867185.62413: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.62415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.62463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.62466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.62533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.64111: stdout chunk (state=3): >>>/root <<< 12033 1726867185.64214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.64237: stderr chunk (state=3): >>><<< 12033 1726867185.64240: stdout chunk (state=3): >>><<< 12033 1726867185.64252: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.64259: _low_level_execute_command(): starting 12033 1726867185.64263: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757 `" && echo ansible-tmp-1726867185.6425145-13156-186157172858757="` echo /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757 `" ) && sleep 0' 12033 1726867185.64806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.64850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.64854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 
1726867185.64926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.66836: stdout chunk (state=3): >>>ansible-tmp-1726867185.6425145-13156-186157172858757=/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757 <<< 12033 1726867185.67019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.67022: stdout chunk (state=3): >>><<< 12033 1726867185.67030: stderr chunk (state=3): >>><<< 12033 1726867185.67162: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867185.6425145-13156-186157172858757=/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.67166: variable 'ansible_module_compression' from source: unknown 12033 1726867185.67303: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867185.67330: variable 'ansible_facts' from source: unknown 12033 1726867185.67542: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py 12033 1726867185.67839: Sending initial data 12033 1726867185.67842: Sent initial data (156 bytes) 12033 1726867185.68657: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.68663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.68760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.69095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.69170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.70694: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867185.70753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867185.70789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmptotym9dk /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py <<< 12033 1726867185.70847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py" <<< 12033 1726867185.70895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmptotym9dk" to remote "/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py" <<< 12033 1726867185.72845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.72849: stdout chunk (state=3): >>><<< 12033 1726867185.72852: stderr chunk (state=3): >>><<< 12033 1726867185.72854: done transferring module to remote 12033 1726867185.72856: _low_level_execute_command(): starting 12033 1726867185.72858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/ /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py && sleep 0' 12033 1726867185.74028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.74307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.74311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.74388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.76223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.76227: stdout chunk (state=3): >>><<< 12033 1726867185.76230: stderr chunk (state=3): >>><<< 12033 1726867185.76348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867185.76356: _low_level_execute_command(): starting 12033 1726867185.76359: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/AnsiballZ_command.py && sleep 0' 12033 1726867185.77943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.77980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867185.77989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.78010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.78093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.93424: stdout chunk (state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 17:19:45.928768", "end": "2024-09-20 17:19:45.931878", "delta": "0:00:00.003110", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867185.94944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867185.94973: stderr chunk (state=3): >>><<< 12033 1726867185.94980: stdout chunk (state=3): >>><<< 12033 1726867185.95002: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 17:19:45.928768", "end": "2024-09-20 17:19:45.931878", "delta": "0:00:00.003110", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
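The JSON payload returned above is the `ansible.legacy.command` result for `cat /sys/class/net/nm-bond/bonding/ad_select`, and the play's subsequent check, logged as `Evaluated conditional (bond_opt.value in result.stdout): True`, is a plain substring test. A minimal standalone sketch of that test (the values are copied from the result above; the variable names are illustrative, not Ansible's):

```shell
# Trimmed stdout from the module result above ("stable 0" = mode name + index).
stdout="stable 0"
# The expected bond option value from the loop item {'key': 'ad_select', 'value': 'stable'}.
expected="stable"

# Same membership test the playbook evaluates: bond_opt.value in result.stdout
case "$stdout" in
  *"$expected"*) echo "ok: ad_select matches" ;;
  *) echo "mismatch: expected '$expected' in '$stdout'" >&2; exit 1 ;;
esac
```

Because the sysfs file reports both the mode name and its numeric index, a substring test passes for either representation.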
12033 1726867185.95026: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867185.95029: _low_level_execute_command(): starting 12033 1726867185.95032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867185.6425145-13156-186157172858757/ > /dev/null 2>&1 && sleep 0' 12033 1726867185.95451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.95455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.95457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.95459: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.95496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.95500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.95566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867185.97583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867185.97586: stdout chunk (state=3): >>><<< 12033 1726867185.97595: stderr chunk (state=3): >>><<< 12033 1726867185.97598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 12033 1726867185.97604: handler run complete 12033 1726867185.97606: Evaluated conditional (False): False 12033 1726867185.97649: variable 'bond_opt' from source: unknown 12033 1726867185.97658: variable 'result' from source: unknown 12033 1726867185.97672: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867185.97687: attempt loop complete, returning result 12033 1726867185.97706: variable 'bond_opt' from source: unknown 12033 1726867185.97771: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003110", "end": "2024-09-20 17:19:45.931878", "rc": 0, "start": "2024-09-20 17:19:45.928768" } STDOUT: stable 0 12033 1726867185.97924: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.97927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.97929: variable 'omit' from source: magic vars 12033 1726867185.98033: variable 'ansible_distribution_major_version' from source: facts 12033 1726867185.98041: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867185.98044: variable 'omit' from source: magic vars 12033 1726867185.98070: variable 'omit' from source: magic vars 12033 1726867185.98214: variable 'controller_device' from source: play vars 12033 1726867185.98217: variable 'bond_opt' from source: unknown 12033 1726867185.98219: variable 'omit' from source: magic vars 12033 1726867185.98251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867185.98254: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867185.98256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867185.98283: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867185.98286: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.98288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.98343: Set connection var ansible_pipelining to False 12033 1726867185.98358: Set connection var ansible_shell_executable to /bin/sh 12033 1726867185.98362: Set connection var ansible_timeout to 10 12033 1726867185.98364: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867185.98366: Set connection var ansible_connection to ssh 12033 1726867185.98368: Set connection var ansible_shell_type to sh 12033 1726867185.98583: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.98587: variable 'ansible_connection' from source: unknown 12033 1726867185.98589: variable 'ansible_module_compression' from source: unknown 12033 1726867185.98594: variable 'ansible_shell_type' from source: unknown 12033 1726867185.98596: variable 'ansible_shell_executable' from source: unknown 12033 1726867185.98597: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867185.98599: variable 'ansible_pipelining' from source: unknown 12033 1726867185.98601: variable 'ansible_timeout' from source: unknown 12033 1726867185.98603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867185.98605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867185.98607: variable 'omit' from source: magic vars 12033 1726867185.98609: starting attempt loop 12033 1726867185.98611: running the handler 12033 1726867185.98613: _low_level_execute_command(): starting 12033 1726867185.98615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867185.99038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867185.99041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867185.99075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.99085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867185.99088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867185.99094: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867185.99099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.99123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867185.99126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867185.99128: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867185.99172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867185.99176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867185.99181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867185.99226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867185.99284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867185.99303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.00903: stdout chunk (state=3): >>>/root <<< 12033 1726867186.01005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.01025: stderr chunk (state=3): >>><<< 12033 1726867186.01029: stdout chunk (state=3): >>><<< 12033 1726867186.01041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12033 1726867186.01048: _low_level_execute_command(): starting 12033 1726867186.01052: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429 `" && echo ansible-tmp-1726867186.0103998-13156-204368468750429="` echo /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429 `" ) && sleep 0' 12033 1726867186.01537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.01541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867186.01544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.01600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.01650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.03528: stdout chunk (state=3): 
>>>ansible-tmp-1726867186.0103998-13156-204368468750429=/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429 <<< 12033 1726867186.03660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.03663: stdout chunk (state=3): >>><<< 12033 1726867186.03665: stderr chunk (state=3): >>><<< 12033 1726867186.03675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867186.0103998-13156-204368468750429=/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.03695: variable 'ansible_module_compression' from source: unknown 12033 1726867186.03820: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867186.03823: variable 'ansible_facts' 
from source: unknown 12033 1726867186.03826: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py 12033 1726867186.03851: Sending initial data 12033 1726867186.03860: Sent initial data (156 bytes) 12033 1726867186.04280: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.04318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.04331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.04348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.04400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.05930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867186.05972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867186.06017: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpviomb2oq /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py <<< 12033 1726867186.06025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py" <<< 12033 1726867186.06059: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpviomb2oq" to remote "/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py" <<< 12033 1726867186.06605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.06634: stderr chunk (state=3): >>><<< 12033 1726867186.06637: stdout chunk (state=3): >>><<< 12033 1726867186.06655: done transferring module to remote 12033 1726867186.06661: _low_level_execute_command(): starting 12033 1726867186.06665: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/ /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py 
&& sleep 0' 12033 1726867186.07045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.07075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.07082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867186.07084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.07086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.07091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.07143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.07147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.07192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.09012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.09032: stderr chunk (state=3): >>><<< 12033 1726867186.09036: stdout chunk (state=3): >>><<< 12033 1726867186.09067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.09070: _low_level_execute_command(): starting 12033 1726867186.09073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/AnsiballZ_command.py && sleep 0' 12033 1726867186.09453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.09495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.09499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867186.09501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867186.09503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.09505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.09548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.09555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.09606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.25000: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 17:19:46.244562", "end": "2024-09-20 17:19:46.247659", "delta": "0:00:00.003097", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867186.26583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867186.26587: stdout chunk (state=3): >>><<< 12033 1726867186.26589: stderr chunk (state=3): >>><<< 12033 1726867186.26594: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 17:19:46.244562", "end": "2024-09-20 17:19:46.247659", "delta": "0:00:00.003097", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867186.26597: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867186.26599: _low_level_execute_command(): starting 12033 1726867186.26601: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867186.0103998-13156-204368468750429/ > /dev/null 2>&1 && sleep 0' 12033 1726867186.27753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.27872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.28082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.28085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.28115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.29944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.30055: stderr chunk (state=3): >>><<< 12033 1726867186.30069: stdout chunk (state=3): >>><<< 12033 1726867186.30162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.30176: handler run complete 12033 
1726867186.30206: Evaluated conditional (False): False 12033 1726867186.30598: variable 'bond_opt' from source: unknown 12033 1726867186.30621: variable 'result' from source: unknown 12033 1726867186.30630: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867186.30711: attempt loop complete, returning result 12033 1726867186.30737: variable 'bond_opt' from source: unknown 12033 1726867186.30943: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003097", "end": "2024-09-20 17:19:46.247659", "rc": 0, "start": "2024-09-20 17:19:46.244562" } STDOUT: 1023 12033 1726867186.31397: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867186.31482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867186.31486: variable 'omit' from source: magic vars 12033 1726867186.31771: variable 'ansible_distribution_major_version' from source: facts 12033 1726867186.31810: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867186.31950: variable 'omit' from source: magic vars 12033 1726867186.31953: variable 'omit' from source: magic vars 12033 1726867186.32557: variable 'controller_device' from source: play vars 12033 1726867186.32560: variable 'bond_opt' from source: unknown 12033 1726867186.32562: variable 'omit' from source: magic vars 12033 1726867186.32565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867186.32567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867186.32569: Loading 
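The `ok:` result above comes from Ansible evaluating the per-item conditional `bond_opt.value in result.stdout` against the command output `1023`. A minimal sketch of that membership check (the helper name `verify_bond_opt` is illustrative, not part of Ansible's API):

```python
# Sketch of the conditional Ansible evaluates for each bond option in the log:
# "bond_opt.value in result.stdout". This is plain substring membership, the
# same operator Jinja2 applies when rendering the `in` test.

def verify_bond_opt(expected_value: str, command_stdout: str) -> bool:
    """Return True when the expected option value appears in the command output."""
    return expected_value in command_stdout

# The log shows stdout "1023" for the ad_user_port_key option:
print(verify_bond_opt("1023", "1023"))  # True
```

Note that because this is a substring test rather than an equality test, a value like `"102"` would also pass against stdout `"1023"`; the playbook relies on the sysfs file holding exactly one value per line.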
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867186.32571: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867186.32573: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867186.32575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867186.32997: Set connection var ansible_pipelining to False 12033 1726867186.33000: Set connection var ansible_shell_executable to /bin/sh 12033 1726867186.33002: Set connection var ansible_timeout to 10 12033 1726867186.33004: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867186.33005: Set connection var ansible_connection to ssh 12033 1726867186.33007: Set connection var ansible_shell_type to sh 12033 1726867186.33009: variable 'ansible_shell_executable' from source: unknown 12033 1726867186.33011: variable 'ansible_connection' from source: unknown 12033 1726867186.33089: variable 'ansible_module_compression' from source: unknown 12033 1726867186.33107: variable 'ansible_shell_type' from source: unknown 12033 1726867186.33114: variable 'ansible_shell_executable' from source: unknown 12033 1726867186.33120: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867186.33128: variable 'ansible_pipelining' from source: unknown 12033 1726867186.33134: variable 'ansible_timeout' from source: unknown 12033 1726867186.33141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867186.33608: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867186.33611: variable 'omit' from source: magic vars 12033 
1726867186.33613: starting attempt loop 12033 1726867186.33615: running the handler 12033 1726867186.33617: _low_level_execute_command(): starting 12033 1726867186.33618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867186.34693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867186.34696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.34704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.34792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.34797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867186.34934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.34937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.34940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.34944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.34975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.36719: stdout chunk (state=3): >>>/root <<< 12033 
1726867186.36723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.36729: stdout chunk (state=3): >>><<< 12033 1726867186.36732: stderr chunk (state=3): >>><<< 12033 1726867186.36746: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.36753: _low_level_execute_command(): starting 12033 1726867186.36757: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205 `" && echo ansible-tmp-1726867186.367446-13156-279306437447205="` echo /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205 `" ) && sleep 0' 12033 1726867186.37967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 
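The `echo ~ && sleep 0` round-trip above is how Ansible discovers the remote user's home directory before expanding the `~/.ansible/tmp` remote_tmp path. A sketch of the same shell invocation, run locally for illustration:

```python
# Sketch of the "echo ~ && sleep 0" step from the log: ask the shell for the
# current user's home directory (tilde expansion), which Ansible then uses to
# resolve '_ansible_remote_tmp': '~/.ansible/tmp'.
import subprocess

proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = proc.stdout.strip()
print(home)  # /root in the log above, since the play connects as root
```

The trailing `sleep 0` is a no-op that forces the shell to treat the pipeline as a compound command, so the exit status reflects the whole `&&` chain.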
1726867186.37970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.37972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.37975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.38026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867186.38029: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867186.38031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.38033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867186.38260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.38275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.38354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.40530: stdout chunk (state=3): >>>ansible-tmp-1726867186.367446-13156-279306437447205=/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205 <<< 12033 1726867186.40533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.40536: stdout chunk (state=3): >>><<< 12033 1726867186.40538: stderr chunk 
(state=3): >>><<< 12033 1726867186.40540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867186.367446-13156-279306437447205=/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.40660: variable 'ansible_module_compression' from source: unknown 12033 1726867186.40751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867186.40780: variable 'ansible_facts' from source: unknown 12033 1726867186.41129: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py 12033 1726867186.41510: Sending initial data 12033 1726867186.41513: Sent initial data (155 bytes) 12033 1726867186.42209: stderr chunk (state=3): 
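The `( umask 77 && mkdir -p ... )` command above creates a private per-task temp directory on the remote before the `AnsiballZ_command.py` payload is transferred into it. A sketch of the equivalent directory creation (the path naming here is illustrative; Ansible embeds a timestamp and random suffix in the real name, as visible in the log):

```python
# Sketch of the remote "( umask 77 && mkdir -p ... )" step: create a per-task
# temp directory readable only by the owner (mode 0700 on a typical POSIX
# system). Uses a local mkdtemp() as a stand-in for ~/.ansible/tmp.
import os
import stat
import tempfile
import time

base = tempfile.mkdtemp()  # stand-in for the remote ~/.ansible/tmp
task_dir = os.path.join(base, f"ansible-tmp-{time.time()}-example")

old_umask = os.umask(0o077)  # equivalent of the shell's `umask 77`
try:
    os.makedirs(task_dir)    # mkdir -p behaviour
finally:
    os.umask(old_umask)      # restore the previous umask

mode = stat.S_IMODE(os.stat(task_dir).st_mode)
print(oct(mode))  # 0o700
```

Restricting the directory to 0700 matters because the module payload and its arguments (which can contain secrets) live there until the `rm -f -r ... && sleep 0` cleanup seen later in the log.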
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.42218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.42242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.42251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.42331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.42482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.42547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.44286: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867186.44417: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk1hanaaz /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py <<< 12033 1726867186.44421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py" <<< 12033 1726867186.44438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk1hanaaz" to remote "/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py" <<< 12033 1726867186.46938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.46987: stderr chunk (state=3): >>><<< 12033 1726867186.46993: stdout chunk (state=3): >>><<< 12033 1726867186.46996: done transferring module to remote 12033 1726867186.46998: _low_level_execute_command(): starting 12033 1726867186.47000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/ /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py && sleep 0' 12033 1726867186.48150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.48156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.48396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.50126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.50184: stderr chunk (state=3): >>><<< 12033 1726867186.50300: stdout chunk (state=3): >>><<< 12033 1726867186.50385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.50389: _low_level_execute_command(): starting 12033 1726867186.50395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/AnsiballZ_command.py && sleep 0' 12033 1726867186.51574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867186.51592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.51608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.51627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.51647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867186.51662: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867186.51710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.51713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867186.51790: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867186.51808: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.51842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.51857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.52007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.52291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.67543: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 17:19:46.669976", "end": "2024-09-20 17:19:46.673077", "delta": "0:00:00.003101", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867186.69142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867186.69146: stdout chunk (state=3): >>><<< 12033 1726867186.69149: stderr chunk (state=3): >>><<< 12033 1726867186.69171: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 17:19:46.669976", "end": "2024-09-20 17:19:46.673077", "delta": "0:00:00.003101", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867186.69309: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867186.69313: _low_level_execute_command(): starting 12033 1726867186.69316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867186.367446-13156-279306437447205/ > /dev/null 2>&1 && sleep 0' 12033 1726867186.69937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867186.69950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.70003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.70017: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.70098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.70122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.70139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.70161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.70246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.72049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.72075: stderr chunk (state=3): >>><<< 12033 1726867186.72080: stdout chunk (state=3): >>><<< 12033 1726867186.72097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867186.72101: handler run complete
12033 1726867186.72118: Evaluated conditional (False): False
12033 1726867186.72223: variable 'bond_opt' from source: unknown
12033 1726867186.72228: variable 'result' from source: unknown
12033 1726867186.72238: Evaluated conditional (bond_opt.value in result.stdout): True
12033 1726867186.72247: attempt loop complete, returning result
12033 1726867186.72260: variable 'bond_opt' from source: unknown
12033 1726867186.72311: variable 'bond_opt' from source: unknown
ok: [managed_node3] => (item={'key': 'all_slaves_active', 'value': '1'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "all_slaves_active",
        "value": "1"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/all_slaves_active"
    ],
    "delta": "0:00:00.003101",
    "end": "2024-09-20 17:19:46.673077",
    "rc": 0,
    "start": "2024-09-20 17:19:46.669976"
}

STDOUT:

1

12033 1726867186.72437: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867186.72440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867186.72443: variable 'omit' from source: magic vars
12033 1726867186.72529: variable 'ansible_distribution_major_version' from source: facts
12033 1726867186.72533: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867186.72537: variable 'omit' from source: magic vars
12033 1726867186.72549: variable 'omit' from source: magic vars
12033 1726867186.72658: variable 'controller_device' from source: play vars
12033 1726867186.72675: variable 'bond_opt' from source: unknown
12033 1726867186.72680: variable 'omit' from source: magic vars
12033 1726867186.72699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py
(found_in_cache=True, class_only=False) 12033 1726867186.72705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867186.72711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867186.72720: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867186.72723: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867186.72725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867186.72780: Set connection var ansible_pipelining to False 12033 1726867186.72784: Set connection var ansible_shell_executable to /bin/sh 12033 1726867186.72786: Set connection var ansible_timeout to 10 12033 1726867186.72788: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867186.72791: Set connection var ansible_connection to ssh 12033 1726867186.72800: Set connection var ansible_shell_type to sh 12033 1726867186.72813: variable 'ansible_shell_executable' from source: unknown 12033 1726867186.72816: variable 'ansible_connection' from source: unknown 12033 1726867186.72818: variable 'ansible_module_compression' from source: unknown 12033 1726867186.72820: variable 'ansible_shell_type' from source: unknown 12033 1726867186.72822: variable 'ansible_shell_executable' from source: unknown 12033 1726867186.72824: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867186.72829: variable 'ansible_pipelining' from source: unknown 12033 1726867186.72831: variable 'ansible_timeout' from source: unknown 12033 1726867186.72836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867186.72902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867186.72909: variable 'omit' from source: magic vars 12033 1726867186.72911: starting attempt loop 12033 1726867186.72913: running the handler 12033 1726867186.72920: _low_level_execute_command(): starting 12033 1726867186.72923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867186.73345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.73350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.73352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.73354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867186.73356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.73401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.73406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.73452: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.75046: stdout chunk (state=3): >>>/root <<< 12033 1726867186.75150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.75170: stderr chunk (state=3): >>><<< 12033 1726867186.75174: stdout chunk (state=3): >>><<< 12033 1726867186.75187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.75196: _low_level_execute_command(): starting 12033 1726867186.75201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230 `" && echo ansible-tmp-1726867186.7518604-13156-249749031283230="` echo 
/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230 `" ) && sleep 0' 12033 1726867186.75602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.75605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867186.75607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867186.75613: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.75615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.75697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.75730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.77660: stdout chunk (state=3): >>>ansible-tmp-1726867186.7518604-13156-249749031283230=/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230 <<< 12033 1726867186.77764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.77783: stderr chunk (state=3): >>><<< 12033 1726867186.77786: stdout chunk (state=3): >>><<< 12033 
1726867186.77800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867186.7518604-13156-249749031283230=/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.77815: variable 'ansible_module_compression' from source: unknown 12033 1726867186.77841: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867186.77854: variable 'ansible_facts' from source: unknown 12033 1726867186.77910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py 12033 1726867186.78004: Sending initial data 12033 1726867186.78007: Sent initial data (156 bytes) 12033 1726867186.78396: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.78400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.78427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.78467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.78471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.78524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.80221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867186.80374: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867186.80456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjn7qv90x /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py <<< 12033 1726867186.80460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py" <<< 12033 1726867186.80684: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjn7qv90x" to remote "/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py" <<< 12033 1726867186.83449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867186.83456: stdout chunk (state=3): >>><<< 12033 1726867186.83472: stderr chunk (state=3): >>><<< 12033 1726867186.83555: done transferring module to remote 12033 1726867186.83563: _low_level_execute_command(): starting 12033 1726867186.83566: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/ /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py && sleep 0' 12033 1726867186.84688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867186.84695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.84702: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12033 1726867186.84716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.84731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867186.84737: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867186.84745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.84763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867186.84767: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867186.84773: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867186.84784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867186.84795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867186.84806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867186.84814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867186.84821: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867186.84831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.84897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867186.84917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.84981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.85036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867186.86870: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 12033 1726867186.86893: stderr chunk (state=3): >>><<< 12033 1726867186.86898: stdout chunk (state=3): >>><<< 12033 1726867186.86914: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867186.86917: _low_level_execute_command(): starting 12033 1726867186.86922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/AnsiballZ_command.py && sleep 0' 12033 1726867186.87382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867186.87401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867186.87504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867186.87508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867186.87557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.03414: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 17:19:47.027368", "end": "2024-09-20 17:19:47.030467", "delta": "0:00:00.003099", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867187.05085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867187.05089: stdout chunk (state=3): >>><<< 12033 1726867187.05283: stderr chunk (state=3): >>><<< 12033 1726867187.05287: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 17:19:47.027368", "end": "2024-09-20 17:19:47.030467", "delta": "0:00:00.003099", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867187.05293: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867187.05295: _low_level_execute_command(): starting 12033 1726867187.05297: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867186.7518604-13156-249749031283230/ > /dev/null 2>&1 && sleep 0' 12033 1726867187.06028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867187.06388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.06413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.06489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.08546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.08549: stdout chunk (state=3): >>><<< 12033 1726867187.08551: stderr chunk (state=3): >>><<< 12033 1726867187.08566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.08575: handler run complete 12033 1726867187.08606: Evaluated conditional (False): False 12033 
1726867187.08756: variable 'bond_opt' from source: unknown
12033 1726867187.09382: variable 'result' from source: unknown
12033 1726867187.09385: Evaluated conditional (bond_opt.value in result.stdout): True
12033 1726867187.09387: attempt loop complete, returning result
12033 1726867187.09390: variable 'bond_opt' from source: unknown
12033 1726867187.09394: variable 'bond_opt' from source: unknown
ok: [managed_node3] => (item={'key': 'downdelay', 'value': '0'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "downdelay",
        "value": "0"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/downdelay"
    ],
    "delta": "0:00:00.003099",
    "end": "2024-09-20 17:19:47.030467",
    "rc": 0,
    "start": "2024-09-20 17:19:47.027368"
}

STDOUT:

0

12033 1726867187.10033: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867187.10037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867187.10042: variable 'omit' from source: magic vars
12033 1726867187.10574: variable 'ansible_distribution_major_version' from source: facts
12033 1726867187.10579: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867187.10585: variable 'omit' from source: magic vars
12033 1726867187.10602: variable 'omit' from source: magic vars
12033 1726867187.10944: variable 'controller_device' from source: play vars
12033 1726867187.11288: variable 'bond_opt' from source: unknown
12033 1726867187.11291: variable 'omit' from source: magic vars
12033 1726867187.11294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12033 1726867187.11296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867187.11298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py
(found_in_cache=True, class_only=False) 12033 1726867187.11301: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867187.11303: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.11305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.11356: Set connection var ansible_pipelining to False 12033 1726867187.11371: Set connection var ansible_shell_executable to /bin/sh 12033 1726867187.11388: Set connection var ansible_timeout to 10 12033 1726867187.11507: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867187.11510: Set connection var ansible_connection to ssh 12033 1726867187.11512: Set connection var ansible_shell_type to sh 12033 1726867187.11514: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.11516: variable 'ansible_connection' from source: unknown 12033 1726867187.11518: variable 'ansible_module_compression' from source: unknown 12033 1726867187.11520: variable 'ansible_shell_type' from source: unknown 12033 1726867187.11521: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.11523: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.11525: variable 'ansible_pipelining' from source: unknown 12033 1726867187.11527: variable 'ansible_timeout' from source: unknown 12033 1726867187.11528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.11664: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867187.11735: variable 'omit' from source: magic vars 12033 1726867187.11744: starting attempt loop 12033 1726867187.11751: running the handler 12033 
1726867187.11762: _low_level_execute_command(): starting 12033 1726867187.11770: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867187.13264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.13432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.13605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.13639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.13713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.15310: stdout chunk (state=3): >>>/root <<< 12033 1726867187.15509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.15512: stdout chunk (state=3): >>><<< 12033 1726867187.15514: stderr chunk (state=3): >>><<< 12033 1726867187.15527: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.15545: _low_level_execute_command(): starting 12033 1726867187.15632: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961 `" && echo ansible-tmp-1726867187.155311-13156-30664868089961="` echo /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961 `" ) && sleep 0' 12033 1726867187.16920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.16932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.16935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.17075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.18842: stdout chunk (state=3): >>>ansible-tmp-1726867187.155311-13156-30664868089961=/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961 <<< 12033 1726867187.19231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.19240: stdout chunk (state=3): >>><<< 12033 1726867187.19243: stderr chunk (state=3): >>><<< 12033 1726867187.19258: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867187.155311-13156-30664868089961=/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.19282: variable 'ansible_module_compression' from source: unknown 12033 1726867187.19321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867187.19361: variable 'ansible_facts' from source: unknown 12033 1726867187.19618: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py 12033 1726867187.19768: Sending initial data 12033 1726867187.19771: Sent initial data (154 bytes) 12033 1726867187.21162: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.21170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.21187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.21580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.23133: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdptrl8wd" to remote "/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py" <<< 12033 1726867187.23143: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdptrl8wd /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py <<< 12033 1726867187.24888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.24892: stderr chunk (state=3): >>><<< 12033 1726867187.24894: stdout chunk (state=3): >>><<< 12033 1726867187.24896: done transferring module to remote 12033 1726867187.24898: _low_level_execute_command(): starting 12033 1726867187.24900: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/ /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py && sleep 0' 12033 1726867187.25939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.25981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.25993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.26180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.26227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.28041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.28282: stdout chunk (state=3): >>><<< 12033 1726867187.28286: stderr chunk (state=3): >>><<< 12033 1726867187.28288: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.28294: _low_level_execute_command(): starting 12033 1726867187.28296: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/AnsiballZ_command.py && sleep 0' 12033 1726867187.29648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.29802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.29859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.45306: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 17:19:47.447533", "end": "2024-09-20 17:19:47.450711", "delta": "0:00:00.003178", 
"msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867187.46776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867187.46803: stderr chunk (state=3): >>><<< 12033 1726867187.46806: stdout chunk (state=3): >>><<< 12033 1726867187.46823: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 17:19:47.447533", "end": "2024-09-20 17:19:47.450711", "delta": "0:00:00.003178", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867187.46847: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867187.46851: _low_level_execute_command(): starting 12033 1726867187.46856: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867187.155311-13156-30664868089961/ > /dev/null 2>&1 && sleep 0' 12033 1726867187.47260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.47264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867187.47300: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.47303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.47306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.47353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.47360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.47362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.47407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.49220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.49242: stderr chunk (state=3): >>><<< 12033 1726867187.49245: stdout chunk (state=3): >>><<< 12033 1726867187.49262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867187.49268: handler run complete
12033 1726867187.49282: Evaluated conditional (False): False
12033 1726867187.49383: variable 'bond_opt' from source: unknown
12033 1726867187.49389: variable 'result' from source: unknown
12033 1726867187.49400: Evaluated conditional (bond_opt.value in result.stdout): True
12033 1726867187.49409: attempt loop complete, returning result
12033 1726867187.49423: variable 'bond_opt' from source: unknown
12033 1726867187.49471: variable 'bond_opt' from source: unknown
ok: [managed_node3] => (item={'key': 'lacp_rate', 'value': 'slow'}) => {
    "ansible_loop_var": "bond_opt",
    "attempts": 1,
    "bond_opt": {
        "key": "lacp_rate",
        "value": "slow"
    },
    "changed": false,
    "cmd": [
        "cat",
        "/sys/class/net/nm-bond/bonding/lacp_rate"
    ],
    "delta": "0:00:00.003178",
    "end": "2024-09-20 17:19:47.450711",
    "rc": 0,
    "start": "2024-09-20 17:19:47.447533"
}

STDOUT:

slow 0

12033 1726867187.49598: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867187.49601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867187.49603: variable 'omit' from source: magic vars
12033 1726867187.49683: variable 'ansible_distribution_major_version' from source: facts
12033 1726867187.49688: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867187.49694: variable 'omit' from source: magic vars
12033 1726867187.49704: variable 'omit'
from source: magic vars 12033 1726867187.49809: variable 'controller_device' from source: play vars 12033 1726867187.49812: variable 'bond_opt' from source: unknown 12033 1726867187.49828: variable 'omit' from source: magic vars 12033 1726867187.49844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867187.49850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867187.49856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867187.49865: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867187.49868: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.49870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.49919: Set connection var ansible_pipelining to False 12033 1726867187.49925: Set connection var ansible_shell_executable to /bin/sh 12033 1726867187.49935: Set connection var ansible_timeout to 10 12033 1726867187.49939: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867187.49941: Set connection var ansible_connection to ssh 12033 1726867187.49943: Set connection var ansible_shell_type to sh 12033 1726867187.49958: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.49961: variable 'ansible_connection' from source: unknown 12033 1726867187.49964: variable 'ansible_module_compression' from source: unknown 12033 1726867187.49966: variable 'ansible_shell_type' from source: unknown 12033 1726867187.49968: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.49970: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.49974: variable 'ansible_pipelining' from 
source: unknown 12033 1726867187.49976: variable 'ansible_timeout' from source: unknown 12033 1726867187.49982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.50045: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867187.50048: variable 'omit' from source: magic vars 12033 1726867187.50052: starting attempt loop 12033 1726867187.50059: running the handler 12033 1726867187.50062: _low_level_execute_command(): starting 12033 1726867187.50064: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867187.50468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.50471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.50476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867187.50479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867187.50481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.50529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.50532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.50584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.52171: stdout chunk (state=3): >>>/root <<< 12033 1726867187.52273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.52296: stderr chunk (state=3): >>><<< 12033 1726867187.52299: stdout chunk (state=3): >>><<< 12033 1726867187.52359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.52363: 
_low_level_execute_command(): starting 12033 1726867187.52365: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124 `" && echo ansible-tmp-1726867187.523146-13156-20679727308124="` echo /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124 `" ) && sleep 0' 12033 1726867187.52900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.52986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.53008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.53015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.53089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.54950: stdout chunk (state=3): >>>ansible-tmp-1726867187.523146-13156-20679727308124=/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124 <<< 12033 
1726867187.55062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.55082: stderr chunk (state=3): >>><<< 12033 1726867187.55085: stdout chunk (state=3): >>><<< 12033 1726867187.55097: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867187.523146-13156-20679727308124=/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.55111: variable 'ansible_module_compression' from source: unknown 12033 1726867187.55137: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867187.55151: variable 'ansible_facts' from source: unknown 12033 1726867187.55199: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py 12033 1726867187.55274: Sending initial data 12033 1726867187.55279: Sent initial data (154 bytes) 12033 1726867187.55682: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.55685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.55687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867187.55694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867187.55696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.55789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.55842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.57363: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12033 1726867187.57366: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867187.57402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867187.57444: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpxct89d4b /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py <<< 12033 1726867187.57449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py" <<< 12033 1726867187.57487: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpxct89d4b" to remote "/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py" <<< 12033 1726867187.57497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py" <<< 12033 1726867187.58031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.58062: stderr chunk (state=3): >>><<< 12033 1726867187.58066: stdout chunk (state=3): >>><<< 12033 1726867187.58094: done transferring module to remote 12033 1726867187.58097: _low_level_execute_command(): starting 12033 1726867187.58105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/ /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py && sleep 0' 12033 1726867187.58506: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.58509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867187.58511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867187.58513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.58515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.58567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.58571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.58613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.60345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.60367: stderr chunk (state=3): >>><<< 12033 1726867187.60370: stdout chunk (state=3): >>><<< 12033 1726867187.60384: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.60387: _low_level_execute_command(): starting 12033 1726867187.60389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/AnsiballZ_command.py && sleep 0' 12033 1726867187.60775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.60781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.60788: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867187.60792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867187.60795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.60842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.60848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.60896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.76350: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 17:19:47.758134", "end": "2024-09-20 17:19:47.761263", "delta": "0:00:00.003129", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867187.77843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867187.77871: stderr chunk (state=3): >>><<< 12033 1726867187.77874: stdout chunk (state=3): >>><<< 12033 1726867187.77888: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 17:19:47.758134", "end": "2024-09-20 17:19:47.761263", "delta": "0:00:00.003129", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
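The JSON blob in the stdout above is what the `AnsiballZ_command.py` payload prints on the managed node; the controller parses it and later evaluates the task's `bond_opt.value in result.stdout` conditional against it. A minimal sketch of that check (local simulation only; the real result dict carries many more keys, and `bond_opt` here is reconstructed from the loop item shown later in the log):

```python
import json

# Result JSON as emitted by the remote AnsiballZ_command.py run
# (abridged from the log above).
raw = (
    '{"changed": true, "stdout": "128", "stderr": "", "rc": 0,'
    ' "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"]}'
)
result = json.loads(raw)

# Loop item for this iteration, as shown in the task result block.
bond_opt = {"key": "lp_interval", "value": "128"}

# The conditional the log reports as "Evaluated ... : True".
passed = bond_opt["value"] in result["stdout"]
print(passed)
```

This mirrors why the task reports `ok` rather than `changed: false` failing: the sysfs value read back (`128`) matches the configured bond option.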
12033 1726867187.77910: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867187.77913: _low_level_execute_command(): starting 12033 1726867187.77918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867187.523146-13156-20679727308124/ > /dev/null 2>&1 && sleep 0' 12033 1726867187.78347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.78350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.78352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867187.78355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867187.78357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.78403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.78406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.78410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.78456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.80258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.80285: stderr chunk (state=3): >>><<< 12033 1726867187.80288: stdout chunk (state=3): >>><<< 12033 1726867187.80301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.80304: handler run complete 12033 1726867187.80319: Evaluated conditional (False): False 12033 1726867187.80422: variable 'bond_opt' from source: unknown 12033 1726867187.80426: variable 'result' from source: unknown 12033 1726867187.80438: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867187.80446: attempt loop complete, returning result 12033 1726867187.80460: variable 'bond_opt' from source: unknown 12033 1726867187.80513: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003129", "end": "2024-09-20 17:19:47.761263", "rc": 0, "start": "2024-09-20 17:19:47.758134" } STDOUT: 128 12033 1726867187.80636: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.80639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.80642: variable 'omit' from source: magic vars 12033 1726867187.80727: variable 'ansible_distribution_major_version' from source: facts 12033 1726867187.80730: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867187.80734: variable 'omit' from source: magic vars 12033 1726867187.80745: variable 'omit' from source: magic vars 12033 1726867187.80851: variable 'controller_device' from source: play vars 12033 1726867187.80855: variable 'bond_opt' from source: unknown 12033 1726867187.80871: variable 'omit' from source: magic vars 12033 1726867187.80888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 12033 1726867187.80897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867187.80903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867187.80913: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867187.80915: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.80918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.80963: Set connection var ansible_pipelining to False 12033 1726867187.80970: Set connection var ansible_shell_executable to /bin/sh 12033 1726867187.80982: Set connection var ansible_timeout to 10 12033 1726867187.80985: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867187.80987: Set connection var ansible_connection to ssh 12033 1726867187.80989: Set connection var ansible_shell_type to sh 12033 1726867187.81005: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.81008: variable 'ansible_connection' from source: unknown 12033 1726867187.81010: variable 'ansible_module_compression' from source: unknown 12033 1726867187.81013: variable 'ansible_shell_type' from source: unknown 12033 1726867187.81015: variable 'ansible_shell_executable' from source: unknown 12033 1726867187.81017: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867187.81020: variable 'ansible_pipelining' from source: unknown 12033 1726867187.81022: variable 'ansible_timeout' from source: unknown 12033 1726867187.81027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867187.81088: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867187.81096: variable 'omit' from source: magic vars 12033 1726867187.81099: starting attempt loop 12033 1726867187.81101: running the handler 12033 1726867187.81108: _low_level_execute_command(): starting 12033 1726867187.81110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867187.81511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.81517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867187.81519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.81522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.81568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.81572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.81620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
12033 1726867187.83198: stdout chunk (state=3): >>>/root <<< 12033 1726867187.83300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.83322: stderr chunk (state=3): >>><<< 12033 1726867187.83325: stdout chunk (state=3): >>><<< 12033 1726867187.83336: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.83342: _low_level_execute_command(): starting 12033 1726867187.83348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044 `" && echo ansible-tmp-1726867187.8333504-13156-31413231325044="` echo /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044 `" ) && sleep 0' 12033 1726867187.83747: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.83750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.83752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867187.83754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867187.83756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.83816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.83818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.83820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.83858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.85739: stdout chunk (state=3): >>>ansible-tmp-1726867187.8333504-13156-31413231325044=/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044 <<< 12033 1726867187.85907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.85915: stdout chunk (state=3): >>><<< 12033 1726867187.85919: stderr chunk (state=3): >>><<< 12033 1726867187.85937: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867187.8333504-13156-31413231325044=/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.86024: variable 'ansible_module_compression' from source: unknown 12033 1726867187.86029: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867187.86044: variable 'ansible_facts' from source: unknown 12033 1726867187.86132: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py 12033 1726867187.86263: Sending initial data 12033 1726867187.86367: Sent initial data (155 bytes) 12033 1726867187.86918: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
12033 1726867187.86934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867187.86949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.87004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867187.87022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.87116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.87149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.87225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.88760: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867187.88810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867187.88864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmplb0z8buk /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py <<< 12033 1726867187.88868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py" <<< 12033 1726867187.88919: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmplb0z8buk" to remote "/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py" <<< 12033 1726867187.89711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.89837: stderr chunk (state=3): >>><<< 12033 1726867187.89840: stdout chunk (state=3): >>><<< 12033 1726867187.89842: done transferring module to remote 12033 1726867187.89844: _low_level_execute_command(): starting 12033 1726867187.89847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/ /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py && sleep 0' 12033 1726867187.90456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867187.90472: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12033 1726867187.90510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.90617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867187.90635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.90652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.90673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.90753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867187.92529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867187.92539: stdout chunk (state=3): >>><<< 12033 1726867187.92551: stderr chunk (state=3): >>><<< 12033 1726867187.92569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867187.92581: _low_level_execute_command(): starting 12033 1726867187.92599: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/AnsiballZ_command.py && sleep 0' 12033 1726867187.93206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867187.93221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867187.93332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867187.93348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867187.93373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867187.93395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867187.93487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.08913: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 17:19:48.083550", "end": "2024-09-20 17:19:48.086692", "delta": "0:00:00.003142", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867188.10686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867188.10692: stdout chunk (state=3): >>><<< 12033 1726867188.10695: stderr chunk (state=3): >>><<< 12033 1726867188.10698: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 17:19:48.083550", "end": "2024-09-20 17:19:48.086692", "delta": "0:00:00.003142", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867188.10700: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867188.10706: _low_level_execute_command(): starting 12033 1726867188.10709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867187.8333504-13156-31413231325044/ > /dev/null 2>&1 && sleep 0' 12033 1726867188.11321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.11334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.11350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.11387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867188.11489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.11515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.11609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.13500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.13503: stdout chunk (state=3): >>><<< 12033 1726867188.13505: stderr chunk (state=3): >>><<< 12033 1726867188.13520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 12033 1726867188.13529: handler run complete 12033 1726867188.13552: Evaluated conditional (False): False 12033 1726867188.13800: variable 'bond_opt' from source: unknown 12033 1726867188.13803: variable 'result' from source: unknown 12033 1726867188.13805: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867188.13807: attempt loop complete, returning result 12033 1726867188.13809: variable 'bond_opt' from source: unknown 12033 1726867188.13847: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003142", "end": "2024-09-20 17:19:48.086692", "rc": 0, "start": "2024-09-20 17:19:48.083550" } STDOUT: 110 12033 1726867188.14089: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.14098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.14113: variable 'omit' from source: magic vars 12033 1726867188.14280: variable 'ansible_distribution_major_version' from source: facts 12033 1726867188.14307: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867188.14382: variable 'omit' from source: magic vars 12033 1726867188.14385: variable 'omit' from source: magic vars 12033 1726867188.14509: variable 'controller_device' from source: play vars 12033 1726867188.14528: variable 'bond_opt' from source: unknown 12033 1726867188.14550: variable 'omit' from source: magic vars 12033 1726867188.14573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867188.14588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12033 1726867188.14602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867188.14618: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867188.14634: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.14641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.14720: Set connection var ansible_pipelining to False 12033 1726867188.14742: Set connection var ansible_shell_executable to /bin/sh 12033 1726867188.14755: Set connection var ansible_timeout to 10 12033 1726867188.14763: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867188.14769: Set connection var ansible_connection to ssh 12033 1726867188.14849: Set connection var ansible_shell_type to sh 12033 1726867188.14851: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.14853: variable 'ansible_connection' from source: unknown 12033 1726867188.14855: variable 'ansible_module_compression' from source: unknown 12033 1726867188.14857: variable 'ansible_shell_type' from source: unknown 12033 1726867188.14859: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.14861: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.14862: variable 'ansible_pipelining' from source: unknown 12033 1726867188.14864: variable 'ansible_timeout' from source: unknown 12033 1726867188.14866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.14943: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 
1726867188.14965: variable 'omit' from source: magic vars 12033 1726867188.14973: starting attempt loop 12033 1726867188.14981: running the handler 12033 1726867188.14994: _low_level_execute_command(): starting 12033 1726867188.15002: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867188.15647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.15661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.15674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.15697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.15815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867188.15831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.15851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.15925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.17520: stdout chunk (state=3): >>>/root <<< 12033 1726867188.17658: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 12033 1726867188.17679: stdout chunk (state=3): >>><<< 12033 1726867188.17775: stderr chunk (state=3): >>><<< 12033 1726867188.17780: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.17783: _low_level_execute_command(): starting 12033 1726867188.17785: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606 `" && echo ansible-tmp-1726867188.1770775-13156-55084400091606="` echo /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606 `" ) && sleep 0' 12033 1726867188.18340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.18353: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12033 1726867188.18367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.18386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.18406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867188.18447: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.18520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.18544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.18568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.18650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.20521: stdout chunk (state=3): >>>ansible-tmp-1726867188.1770775-13156-55084400091606=/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606 <<< 12033 1726867188.20667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.20688: stdout chunk (state=3): >>><<< 12033 1726867188.20693: stderr chunk (state=3): >>><<< 12033 1726867188.20786: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867188.1770775-13156-55084400091606=/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.20790: variable 'ansible_module_compression' from source: unknown 12033 1726867188.20795: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867188.20797: variable 'ansible_facts' from source: unknown 12033 1726867188.20874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py 12033 1726867188.21026: Sending initial data 12033 1726867188.21029: Sent initial data (155 bytes) 12033 1726867188.21668: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.21688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867188.21783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.21808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.21892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.23428: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867188.23475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867188.23530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpiobve4rq /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py <<< 12033 1726867188.23533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py" <<< 12033 1726867188.23585: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpiobve4rq" to remote "/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py" <<< 12033 1726867188.24342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.24388: stderr chunk (state=3): >>><<< 12033 1726867188.24400: stdout chunk (state=3): >>><<< 12033 1726867188.24538: done transferring module to remote 12033 1726867188.24542: _low_level_execute_command(): starting 12033 1726867188.24544: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/ /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py && sleep 0' 12033 1726867188.25075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.25092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.25108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.25136: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.25249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.25262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.25280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.25364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.27152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.27162: stdout chunk (state=3): >>><<< 12033 1726867188.27172: stderr chunk (state=3): >>><<< 12033 1726867188.27196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.27207: _low_level_execute_command(): starting 12033 1726867188.27222: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/AnsiballZ_command.py && sleep 0' 12033 1726867188.27836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.27851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.27864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.27883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.27904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867188.27948: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.28017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.28043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.28068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.28158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.43457: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 17:19:48.429017", "end": "2024-09-20 17:19:48.432256", "delta": "0:00:00.003239", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867188.44965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867188.44993: stderr chunk (state=3): >>><<< 12033 1726867188.44997: stdout chunk (state=3): >>><<< 12033 1726867188.45009: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 17:19:48.429017", "end": "2024-09-20 17:19:48.432256", "delta": "0:00:00.003239", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867188.45029: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867188.45034: _low_level_execute_command(): starting 12033 1726867188.45038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867188.1770775-13156-55084400091606/ > /dev/null 2>&1 && sleep 0' 12033 1726867188.45584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.45609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.45685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.47556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.47560: stdout chunk (state=3): >>><<< 12033 1726867188.47563: stderr chunk (state=3): >>><<< 12033 1726867188.47884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.47888: handler run complete 12033 1726867188.47893: Evaluated conditional (False): False 12033 1726867188.47896: variable 'bond_opt' from source: 
unknown 12033 1726867188.47898: variable 'result' from source: unknown 12033 1726867188.47900: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867188.48096: attempt loop complete, returning result 12033 1726867188.48119: variable 'bond_opt' from source: unknown 12033 1726867188.48187: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'num_grat_arp', 'value': '64'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "num_grat_arp", "value": "64" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/num_grat_arp" ], "delta": "0:00:00.003239", "end": "2024-09-20 17:19:48.432256", "rc": 0, "start": "2024-09-20 17:19:48.429017" } STDOUT: 64 12033 1726867188.48883: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.48886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.48888: variable 'omit' from source: magic vars 12033 1726867188.48893: variable 'ansible_distribution_major_version' from source: facts 12033 1726867188.48895: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867188.48897: variable 'omit' from source: magic vars 12033 1726867188.48899: variable 'omit' from source: magic vars 12033 1726867188.49238: variable 'controller_device' from source: play vars 12033 1726867188.49249: variable 'bond_opt' from source: unknown 12033 1726867188.49271: variable 'omit' from source: magic vars 12033 1726867188.49408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867188.49423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867188.49433: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
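The `Evaluated conditional (bond_opt.value in result.stdout): True` entries above are the heart of this loop: each bond option read back from `/sys/class/net/nm-bond/bonding/` is compared against its expected value with a plain substring test (the `"attempts": 1` field suggests an `until`/`retries`-style assertion, though the task source is not shown in this log). A minimal Python sketch of that per-item check, with the dicts mirroring the loop item and command result recorded above:

```python
# Sketch of the per-item verification this loop performs.
# bond_opt mirrors the loop item; result mirrors the command module's
# return for num_grat_arp as recorded in the log above.
bond_opt = {"key": "num_grat_arp", "value": "64"}
result = {
    "rc": 0,
    "stdout": "64",
    "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"],
}

# The conditional the log shows being evaluated: a simple substring test.
passed = bond_opt["value"] in result["stdout"]
print(passed)  # → True, matching "Evaluated conditional (...): True"
```

Note that `in` is a substring match, not equality, so e.g. an expected `"64"` would also pass against a stdout of `"640"`; the log only tells us the substring form was used.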
12033 1726867188.49451: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867188.49457: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.49469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.49542: Set connection var ansible_pipelining to False 12033 1726867188.49783: Set connection var ansible_shell_executable to /bin/sh 12033 1726867188.49787: Set connection var ansible_timeout to 10 12033 1726867188.49789: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867188.49794: Set connection var ansible_connection to ssh 12033 1726867188.49796: Set connection var ansible_shell_type to sh 12033 1726867188.49798: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.49800: variable 'ansible_connection' from source: unknown 12033 1726867188.49801: variable 'ansible_module_compression' from source: unknown 12033 1726867188.49803: variable 'ansible_shell_type' from source: unknown 12033 1726867188.49805: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.49807: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.49809: variable 'ansible_pipelining' from source: unknown 12033 1726867188.49811: variable 'ansible_timeout' from source: unknown 12033 1726867188.49813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.50050: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867188.50147: variable 'omit' from source: magic vars 12033 1726867188.50151: starting attempt loop 12033 1726867188.50153: running the handler 12033 1726867188.50155: _low_level_execute_command(): 
starting 12033 1726867188.50157: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867188.51395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.51412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.51436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.51510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.53113: stdout chunk (state=3): >>>/root <<< 12033 1726867188.53328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.53339: stdout chunk (state=3): >>><<< 12033 1726867188.53350: stderr chunk (state=3): >>><<< 12033 1726867188.53369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.53386: _low_level_execute_command(): starting 12033 1726867188.53398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981 `" && echo ansible-tmp-1726867188.5337427-13156-21141267500981="` echo /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981 `" ) && sleep 0' 12033 1726867188.54683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.54857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
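Before shipping `AnsiballZ_command.py`, Ansible creates a per-task remote tmpdir with the `umask 77 && mkdir ...` idiom visible in the `_low_level_execute_command()` line above. The same shell pattern can be reproduced locally (the scratch path and directory name below are illustrative stand-ins for `/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<rand>`):

```shell
# Local re-creation of the remote tmpdir idiom from the log.
# "scratch" stands in for the remote user's home; the dir name is a
# placeholder for the real timestamped ansible-tmp-* name.
scratch="$(mktemp -d)"
( umask 77 && mkdir -p "$scratch/.ansible/tmp" \
  && mkdir "$scratch/.ansible/tmp/ansible-tmp-demo" \
  && echo ansible-tmp-demo="$scratch/.ansible/tmp/ansible-tmp-demo" )
```

The `umask 77` inside the subshell makes the freshly created tmpdir mode 0700, and the trailing `echo name=path` is how the controller learns the resolved path (it appears verbatim as the command's stdout later in the log).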
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.54965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.55096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.55135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.57040: stdout chunk (state=3): >>>ansible-tmp-1726867188.5337427-13156-21141267500981=/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981 <<< 12033 1726867188.57207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.57210: stdout chunk (state=3): >>><<< 12033 1726867188.57213: stderr chunk (state=3): >>><<< 12033 1726867188.57228: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867188.5337427-13156-21141267500981=/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.57264: variable 'ansible_module_compression' from source: unknown 12033 1726867188.57484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867188.57487: variable 'ansible_facts' from source: unknown 12033 1726867188.57615: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py 12033 1726867188.57800: Sending initial data 12033 1726867188.57817: Sent initial data (155 bytes) 12033 1726867188.58585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.58616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.60141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867188.60199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867188.60274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp2ffgx23g /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py <<< 12033 1726867188.60285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py" <<< 12033 1726867188.60353: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp2ffgx23g" to remote "/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py" <<< 12033 1726867188.61105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.61149: stderr chunk (state=3): >>><<< 12033 1726867188.61160: stdout chunk (state=3): >>><<< 12033 1726867188.61201: done transferring module to remote 12033 1726867188.61214: _low_level_execute_command(): starting 12033 1726867188.61239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/ /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py && sleep 0' 12033 1726867188.62181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.62185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.62211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.62282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.64018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.64060: stderr chunk (state=3): >>><<< 12033 1726867188.64081: stdout chunk (state=3): >>><<< 12033 1726867188.64100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.64103: _low_level_execute_command(): starting 12033 1726867188.64106: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/AnsiballZ_command.py && sleep 0' 12033 1726867188.64882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.64886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.64888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.64894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.64896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867188.64898: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867188.64900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.64902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867188.64908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867188.64913: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867188.64915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867188.64917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.64919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.64921: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867188.64924: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867188.64926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.64930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.64932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.64934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.64976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.80511: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 17:19:48.799539", "end": "2024-09-20 17:19:48.802544", "delta": "0:00:00.003005", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867188.82284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867188.82288: stdout chunk (state=3): >>><<< 12033 1726867188.82293: stderr chunk (state=3): >>><<< 12033 1726867188.82296: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 17:19:48.799539", "end": "2024-09-20 17:19:48.802544", "delta": "0:00:00.003005", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
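The `_low_level_execute_command() done` line above carries the module's entire result as a single JSON object on stdout; the controller parses it and exposes fields such as `rc`, `stdout`, and `delta` to the play. A sketch of that parse step, using the `resend_igmp` result JSON copied from the log (keys trimmed to the ones shown):

```python
import json

# The module result line from the log, verbatim (resend_igmp read-back),
# minus the long "invocation" block for brevity.
raw = (
    '{"changed": true, "stdout": "225", "stderr": "", "rc": 0, '
    '"cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], '
    '"start": "2024-09-20 17:19:48.799539", '
    '"end": "2024-09-20 17:19:48.802544", '
    '"delta": "0:00:00.003005", "msg": ""}'
)

result = json.loads(raw)
print(result["rc"], result["stdout"])  # → 0 225
```

In the real run the controller also has to separate this JSON from the interleaved SSH `debug1:`/`debug2:` noise, which is why stdout and stderr arrive as distinct "chunk" streams in the log.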
12033 1726867188.82299: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867188.82301: _low_level_execute_command(): starting 12033 1726867188.82303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867188.5337427-13156-21141267500981/ > /dev/null 2>&1 && sleep 0' 12033 1726867188.82808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.82812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867188.82824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.82851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867188.82855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867188.82857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.82913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.82917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.82965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.85075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.85085: stdout chunk (state=3): >>><<< 12033 1726867188.85088: stderr chunk (state=3): >>><<< 12033 1726867188.85093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.85095: handler run complete 12033 1726867188.85238: Evaluated conditional (False): False 12033 1726867188.85682: variable 'bond_opt' from source: unknown 12033 1726867188.85686: variable 'result' from source: unknown 12033 1726867188.85709: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867188.85712: attempt loop complete, returning result 12033 1726867188.85714: variable 'bond_opt' from source: unknown 12033 1726867188.85868: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003005", "end": "2024-09-20 17:19:48.802544", "rc": 0, "start": "2024-09-20 17:19:48.799539" } STDOUT: 225 12033 1726867188.86485: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.86489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.86494: variable 'omit' from source: magic vars 12033 1726867188.86837: variable 'ansible_distribution_major_version' from source: facts 12033 1726867188.86841: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867188.86843: variable 'omit' from source: magic vars 12033 1726867188.86945: variable 'omit' from source: magic vars 12033 1726867188.87407: variable 'controller_device' from source: play vars 12033 1726867188.87410: variable 'bond_opt' from source: unknown 12033 1726867188.87412: variable 'omit' from source: magic vars 12033 1726867188.87414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 12033 1726867188.87416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867188.87419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867188.87768: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867188.87771: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.87774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.87776: Set connection var ansible_pipelining to False 12033 1726867188.87780: Set connection var ansible_shell_executable to /bin/sh 12033 1726867188.87806: Set connection var ansible_timeout to 10 12033 1726867188.87820: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867188.87900: Set connection var ansible_connection to ssh 12033 1726867188.87905: Set connection var ansible_shell_type to sh 12033 1726867188.87929: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.87937: variable 'ansible_connection' from source: unknown 12033 1726867188.87944: variable 'ansible_module_compression' from source: unknown 12033 1726867188.87951: variable 'ansible_shell_type' from source: unknown 12033 1726867188.87958: variable 'ansible_shell_executable' from source: unknown 12033 1726867188.87993: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867188.88008: variable 'ansible_pipelining' from source: unknown 12033 1726867188.88016: variable 'ansible_timeout' from source: unknown 12033 1726867188.88023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867188.88239: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867188.88392: variable 'omit' from source: magic vars 12033 1726867188.88396: starting attempt loop 12033 1726867188.88399: running the handler 12033 1726867188.88404: _low_level_execute_command(): starting 12033 1726867188.88406: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867188.89304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867188.89372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.89440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.89459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.89481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.89561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.91393: stdout chunk (state=3): 
>>>/root <<< 12033 1726867188.91396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.91399: stdout chunk (state=3): >>><<< 12033 1726867188.91401: stderr chunk (state=3): >>><<< 12033 1726867188.91403: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.91492: _low_level_execute_command(): starting 12033 1726867188.91498: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033 `" && echo ansible-tmp-1726867188.913853-13156-6870210140033="` echo /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033 `" ) && sleep 0' 12033 1726867188.92181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 12033 1726867188.92335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.92342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.92350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.92353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.92445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.94293: stdout chunk (state=3): >>>ansible-tmp-1726867188.913853-13156-6870210140033=/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033 <<< 12033 1726867188.94399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.94427: stderr chunk (state=3): >>><<< 12033 1726867188.94430: stdout chunk (state=3): >>><<< 12033 1726867188.94442: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867188.913853-13156-6870210140033=/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867188.94458: variable 'ansible_module_compression' from source: unknown 12033 1726867188.94486: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867188.94501: variable 'ansible_facts' from source: unknown 12033 1726867188.94547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py 12033 1726867188.94626: Sending initial data 12033 1726867188.94630: Sent initial data (153 bytes) 12033 1726867188.95034: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.95037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867188.95039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867188.95041: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867188.95043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867188.95098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.95105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.95140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867188.96817: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867188.96862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867188.96906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpb69del9r /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py <<< 12033 1726867188.96915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py" <<< 12033 1726867188.96975: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpb69del9r" to remote "/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py" <<< 12033 1726867188.97649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867188.97662: stderr chunk (state=3): >>><<< 12033 1726867188.97678: stdout chunk (state=3): >>><<< 12033 1726867188.97707: done transferring module to remote 12033 1726867188.97714: _low_level_execute_command(): starting 12033 1726867188.97724: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/ /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py && sleep 0' 12033 1726867188.98369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867188.98404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867188.98408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867188.98492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.00362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.00365: stdout chunk (state=3): >>><<< 12033 1726867189.00376: stderr chunk (state=3): >>><<< 12033 1726867189.00381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.00383: _low_level_execute_command(): starting 12033 1726867189.00385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/AnsiballZ_command.py && sleep 0' 12033 1726867189.01011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867189.01014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.01028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.01036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.01221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.01226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.01228: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 12033 1726867189.01230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.01300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.16648: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 17:19:49.160993", "end": "2024-09-20 17:19:49.164155", "delta": "0:00:00.003162", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867189.18135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867189.18174: stderr chunk (state=3): >>><<< 12033 1726867189.18179: stdout chunk (state=3): >>><<< 12033 1726867189.18196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 17:19:49.160993", "end": "2024-09-20 17:19:49.164155", "delta": "0:00:00.003162", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867189.18216: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867189.18223: _low_level_execute_command(): starting 12033 1726867189.18228: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867188.913853-13156-6870210140033/ > /dev/null 2>&1 && sleep 0' 12033 1726867189.18794: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.18798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.18800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867189.18804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.18807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.18842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.18850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.18851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.18908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.20715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.20739: stderr chunk (state=3): >>><<< 12033 1726867189.20742: stdout chunk (state=3): >>><<< 12033 1726867189.20754: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.20759: handler run complete 12033 1726867189.20780: Evaluated conditional (False): False 12033 1726867189.20889: variable 'bond_opt' from source: unknown 12033 1726867189.20897: variable 'result' from source: unknown 12033 1726867189.20908: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867189.20916: attempt loop complete, returning result 12033 1726867189.20930: variable 'bond_opt' from source: unknown 12033 1726867189.20978: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.003162", "end": "2024-09-20 17:19:49.164155", "rc": 0, "start": "2024-09-20 17:19:49.160993" } STDOUT: 0 12033 1726867189.21107: variable 'ansible_host' from source: host vars for 
'managed_node3' 12033 1726867189.21110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.21113: variable 'omit' from source: magic vars 12033 1726867189.21220: variable 'ansible_distribution_major_version' from source: facts 12033 1726867189.21233: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867189.21236: variable 'omit' from source: magic vars 12033 1726867189.21246: variable 'omit' from source: magic vars 12033 1726867189.21357: variable 'controller_device' from source: play vars 12033 1726867189.21361: variable 'bond_opt' from source: unknown 12033 1726867189.21375: variable 'omit' from source: magic vars 12033 1726867189.21397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867189.21404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867189.21409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867189.21419: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867189.21422: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867189.21424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.21480: Set connection var ansible_pipelining to False 12033 1726867189.21486: Set connection var ansible_shell_executable to /bin/sh 12033 1726867189.21495: Set connection var ansible_timeout to 10 12033 1726867189.21498: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867189.21500: Set connection var ansible_connection to ssh 12033 1726867189.21504: Set connection var ansible_shell_type to sh 12033 1726867189.21518: variable 'ansible_shell_executable' from source: unknown 
12033 1726867189.21521: variable 'ansible_connection' from source: unknown 12033 1726867189.21523: variable 'ansible_module_compression' from source: unknown 12033 1726867189.21525: variable 'ansible_shell_type' from source: unknown 12033 1726867189.21528: variable 'ansible_shell_executable' from source: unknown 12033 1726867189.21530: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867189.21534: variable 'ansible_pipelining' from source: unknown 12033 1726867189.21536: variable 'ansible_timeout' from source: unknown 12033 1726867189.21540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.21606: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867189.21613: variable 'omit' from source: magic vars 12033 1726867189.21616: starting attempt loop 12033 1726867189.21618: running the handler 12033 1726867189.21624: _low_level_execute_command(): starting 12033 1726867189.21630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867189.22141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.22144: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.22190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.22193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.22253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.23809: stdout chunk (state=3): >>>/root <<< 12033 1726867189.23916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.23937: stderr chunk (state=3): >>><<< 12033 1726867189.23940: stdout chunk (state=3): >>><<< 12033 1726867189.23951: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.23958: _low_level_execute_command(): starting 12033 1726867189.23963: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552 `" && echo ansible-tmp-1726867189.239505-13156-97048207325552="` echo /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552 `" ) && sleep 0' 12033 1726867189.24430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.24438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.24442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867189.24444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.24558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.24563: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.24574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.26425: stdout chunk (state=3): >>>ansible-tmp-1726867189.239505-13156-97048207325552=/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552 <<< 12033 1726867189.26539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.26573: stderr chunk (state=3): >>><<< 12033 1726867189.26576: stdout chunk (state=3): >>><<< 12033 1726867189.26590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867189.239505-13156-97048207325552=/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.26608: variable 'ansible_module_compression' from source: 
unknown 12033 1726867189.26637: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867189.26654: variable 'ansible_facts' from source: unknown 12033 1726867189.26696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py 12033 1726867189.26995: Sending initial data 12033 1726867189.26999: Sent initial data (154 bytes) 12033 1726867189.27532: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867189.27557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.27560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.27684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.27708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.27806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.29339: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12033 1726867189.29345: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867189.29397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867189.29436: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjtv6t639 /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py <<< 12033 1726867189.29445: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py" <<< 12033 1726867189.29528: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjtv6t639" to remote "/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py" <<< 12033 1726867189.30196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.30227: stderr chunk (state=3): >>><<< 12033 1726867189.30232: stdout chunk (state=3): >>><<< 12033 1726867189.30245: done transferring module to remote 12033 
1726867189.30251: _low_level_execute_command(): starting 12033 1726867189.30256: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/ /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py && sleep 0' 12033 1726867189.30656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.30663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867189.30695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.30698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.30700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.30752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.30755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.30807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.32634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 
1726867189.32638: stdout chunk (state=3): >>><<< 12033 1726867189.32640: stderr chunk (state=3): >>><<< 12033 1726867189.32643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.32645: _low_level_execute_command(): starting 12033 1726867189.32647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/AnsiballZ_command.py && sleep 0' 12033 1726867189.33483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867189.33487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.33489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.33495: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.33497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867189.33500: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867189.33502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.33504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867189.33506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867189.33508: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867189.33510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.33512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.33514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.33516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.33599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.48912: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 17:19:49.483599", "end": "2024-09-20 17:19:49.486707", "delta": "0:00:00.003108", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867189.50659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867189.50664: stdout chunk (state=3): >>><<< 12033 1726867189.50667: stderr chunk (state=3): >>><<< 12033 1726867189.50692: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 17:19:49.483599", "end": "2024-09-20 17:19:49.486707", "delta": "0:00:00.003108", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867189.50722: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867189.50730: _low_level_execute_command(): starting 12033 1726867189.50732: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867189.239505-13156-97048207325552/ > /dev/null 2>&1 && sleep 0' 12033 1726867189.52023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.52072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.52171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.52174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.52217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.54196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.54199: stdout chunk (state=3): >>><<< 12033 1726867189.54202: stderr chunk (state=3): >>><<< 12033 1726867189.54205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.54207: handler run complete 12033 1726867189.54209: Evaluated conditional (False): False 12033 1726867189.54454: variable 'bond_opt' from source: unknown 12033 1726867189.54466: variable 'result' from source: unknown 12033 1726867189.54493: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867189.54513: attempt loop complete, returning result 12033 1726867189.54536: variable 'bond_opt' from source: unknown 12033 1726867189.54788: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003108", "end": "2024-09-20 17:19:49.486707", "rc": 0, "start": "2024-09-20 17:19:49.483599" } STDOUT: 1 12033 1726867189.54913: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867189.54916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.54918: variable 'omit' from source: magic vars 12033 1726867189.55028: variable 'ansible_distribution_major_version' from source: facts 12033 1726867189.55039: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867189.55047: variable 'omit' from source: magic vars 12033 1726867189.55065: variable 'omit' from source: magic vars 12033 1726867189.55307: variable 'controller_device' from source: play vars 12033 1726867189.55310: variable 'bond_opt' from source: unknown 12033 1726867189.55319: variable 'omit' from source: magic vars 12033 1726867189.55354: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867189.55366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867189.55444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867189.55451: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867189.55453: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867189.55456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.55557: Set connection var ansible_pipelining to False 12033 1726867189.55560: Set connection var ansible_shell_executable to /bin/sh 12033 1726867189.55562: Set connection var ansible_timeout to 10 12033 1726867189.55566: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867189.55571: Set connection var ansible_connection to ssh 12033 1726867189.55573: Set connection var ansible_shell_type to sh 12033 1726867189.55582: variable 'ansible_shell_executable' from source: unknown 12033 1726867189.55589: variable 'ansible_connection' from source: unknown 12033 1726867189.55599: variable 'ansible_module_compression' from source: unknown 12033 1726867189.55606: variable 'ansible_shell_type' from source: unknown 12033 1726867189.55612: variable 'ansible_shell_executable' from source: unknown 12033 1726867189.55619: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867189.55626: variable 'ansible_pipelining' from source: unknown 12033 1726867189.55665: variable 'ansible_timeout' from source: unknown 12033 1726867189.55669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867189.55748: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867189.55761: variable 'omit' from source: magic vars 12033 1726867189.55776: starting attempt loop 12033 1726867189.55808: running the handler 12033 1726867189.56096: _low_level_execute_command(): starting 12033 1726867189.56100: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867189.56955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.56962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.56984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.57050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.58646: stdout chunk (state=3): >>>/root <<< 12033 
1726867189.58771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.58774: stdout chunk (state=3): >>><<< 12033 1726867189.58783: stderr chunk (state=3): >>><<< 12033 1726867189.58799: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.58807: _low_level_execute_command(): starting 12033 1726867189.58812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082 `" && echo ansible-tmp-1726867189.5879774-13156-92868920188082="` echo /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082 `" ) && sleep 0' 12033 1726867189.60101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.60164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.60246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.60263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.60340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.62217: stdout chunk (state=3): >>>ansible-tmp-1726867189.5879774-13156-92868920188082=/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082 <<< 12033 1726867189.62344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.62347: stderr chunk (state=3): >>><<< 12033 1726867189.62352: stdout chunk (state=3): >>><<< 12033 1726867189.62370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867189.5879774-13156-92868920188082=/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.62396: variable 'ansible_module_compression' from source: unknown 12033 1726867189.62431: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867189.62449: variable 'ansible_facts' from source: unknown 12033 1726867189.62524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py 12033 1726867189.63183: Sending initial data 12033 1726867189.63187: Sent initial data (155 bytes) 12033 1726867189.64853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.65047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.65064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.65171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.66906: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867189.66914: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867189.66964: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdtdgauze /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py <<< 12033 1726867189.66967: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py" <<< 12033 1726867189.67004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdtdgauze" to remote "/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py" <<< 12033 1726867189.67007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py" <<< 12033 1726867189.69048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.69052: stderr chunk (state=3): >>><<< 12033 1726867189.69056: stdout chunk (state=3): >>><<< 12033 1726867189.69112: done transferring module to remote 12033 1726867189.69119: _low_level_execute_command(): starting 12033 1726867189.69124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/ /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py && sleep 0' 12033 1726867189.70612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867189.70882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.70898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.70911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.71008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.72749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.72753: stderr chunk (state=3): >>><<< 12033 1726867189.72758: stdout chunk (state=3): >>><<< 12033 1726867189.72775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.72781: _low_level_execute_command(): starting 12033 1726867189.72784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/AnsiballZ_command.py && sleep 0' 12033 1726867189.73333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867189.73343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.73383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.73386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.73389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867189.73391: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867189.73406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.73413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867189.73421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867189.73484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867189.73487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867189.73489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867189.73491: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867189.73493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867189.73495: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867189.73497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867189.73545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867189.73556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.73571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.73649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.89086: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 17:19:49.884669", "end": "2024-09-20 17:19:49.887643", "delta": "0:00:00.002974", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867189.90710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867189.90715: stdout chunk (state=3): >>><<< 12033 1726867189.90723: stderr chunk (state=3): >>><<< 12033 1726867189.90771: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 17:19:49.884669", "end": "2024-09-20 17:19:49.887643", "delta": "0:00:00.002974", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867189.90801: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867189.90804: _low_level_execute_command(): starting 12033 1726867189.90988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867189.5879774-13156-92868920188082/ > /dev/null 2>&1 && sleep 0' 12033 1726867189.92084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867189.92133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867189.92195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867189.94063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867189.94068: stdout chunk (state=3): >>><<< 12033 1726867189.94093: stderr chunk (state=3): >>><<< 12033 1726867189.94149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867189.94152: handler run complete 12033 1726867189.94187: Evaluated conditional (False): False 12033 1726867189.94448: variable 'bond_opt' from source: 
unknown 12033 1726867189.94458: variable 'result' from source: unknown 12033 1726867189.94588: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867189.94599: attempt loop complete, returning result 12033 1726867189.94661: variable 'bond_opt' from source: unknown 12033 1726867189.94797: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.002974", "end": "2024-09-20 17:19:49.887643", "rc": 0, "start": "2024-09-20 17:19:49.884669" } STDOUT: encap2+3 3 12033 1726867189.95088: dumping result to json 12033 1726867189.95114: done dumping result, returning 12033 1726867189.95124: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [0affcac9-a3a5-74bb-502b-000000000400] 12033 1726867189.95129: sending task result for task 0affcac9-a3a5-74bb-502b-000000000400 12033 1726867189.96789: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000400 12033 1726867189.96795: WORKER PROCESS EXITING 12033 1726867189.96911: no more pending results, returning what we have 12033 1726867189.96915: results queue empty 12033 1726867189.96916: checking for any_errors_fatal 12033 1726867189.96925: done checking for any_errors_fatal 12033 1726867189.96926: checking for max_fail_percentage 12033 1726867189.96928: done checking for max_fail_percentage 12033 1726867189.96929: checking to see if all hosts have failed and the running result is not ok 12033 1726867189.96929: done checking to see if all hosts have failed 12033 1726867189.96930: getting the remaining hosts for this loop 12033 1726867189.96931: done getting the remaining hosts for this loop 12033 1726867189.96935: getting the next task for host managed_node3 12033 1726867189.96940: done 
getting next task for host managed_node3 12033 1726867189.96943: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 12033 1726867189.96946: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867189.96950: getting variables 12033 1726867189.96951: in VariableManager get_vars() 12033 1726867189.96975: Calling all_inventory to load vars for managed_node3 12033 1726867189.96980: Calling groups_inventory to load vars for managed_node3 12033 1726867189.96983: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867189.96995: Calling all_plugins_play to load vars for managed_node3 12033 1726867189.96997: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867189.97000: Calling groups_plugins_play to load vars for managed_node3 12033 1726867189.99889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.01987: done with get_vars() 12033 1726867190.02107: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 17:19:50 -0400 (0:00:05.662) 0:00:29.139 ****** 12033 1726867190.02353: entering _queue_task() for managed_node3/include_tasks 12033 1726867190.03021: worker is 1 (out of 1 available) 12033 1726867190.03033: exiting _queue_task() for managed_node3/include_tasks 12033 1726867190.03046: done queuing things up, now waiting for results queue to drain 12033 1726867190.03048: waiting for pending results... 
12033 1726867190.03421: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 12033 1726867190.03731: in run() - task 0affcac9-a3a5-74bb-502b-000000000402 12033 1726867190.03736: variable 'ansible_search_path' from source: unknown 12033 1726867190.03739: variable 'ansible_search_path' from source: unknown 12033 1726867190.03976: calling self._execute() 12033 1726867190.04031: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.04045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.04060: variable 'omit' from source: magic vars 12033 1726867190.04476: variable 'ansible_distribution_major_version' from source: facts 12033 1726867190.04499: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867190.04510: _execute() done 12033 1726867190.04516: dumping result to json 12033 1726867190.04523: done dumping result, returning 12033 1726867190.04541: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [0affcac9-a3a5-74bb-502b-000000000402] 12033 1726867190.04551: sending task result for task 0affcac9-a3a5-74bb-502b-000000000402 12033 1726867190.04808: no more pending results, returning what we have 12033 1726867190.04814: in VariableManager get_vars() 12033 1726867190.04855: Calling all_inventory to load vars for managed_node3 12033 1726867190.04858: Calling groups_inventory to load vars for managed_node3 12033 1726867190.04861: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867190.04874: Calling all_plugins_play to load vars for managed_node3 12033 1726867190.05081: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867190.05087: Calling groups_plugins_play to load vars for managed_node3 12033 1726867190.05695: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000402 12033 1726867190.05699: WORKER PROCESS EXITING 12033 
1726867190.06640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.09239: done with get_vars() 12033 1726867190.09257: variable 'ansible_search_path' from source: unknown 12033 1726867190.09259: variable 'ansible_search_path' from source: unknown 12033 1726867190.09268: variable 'item' from source: include params 12033 1726867190.09381: variable 'item' from source: include params 12033 1726867190.09417: we have included files to process 12033 1726867190.09418: generating all_blocks data 12033 1726867190.09420: done generating all_blocks data 12033 1726867190.09426: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867190.09427: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867190.09430: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867190.09683: done processing included file 12033 1726867190.09685: iterating over new_blocks loaded from include file 12033 1726867190.09687: in VariableManager get_vars() 12033 1726867190.09705: done with get_vars() 12033 1726867190.09707: filtering new block on tags 12033 1726867190.09735: done filtering new block on tags 12033 1726867190.09737: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 12033 1726867190.09742: extending task lists for all hosts with included blocks 12033 1726867190.09984: done extending task lists 12033 1726867190.09986: done processing included files 12033 1726867190.09986: results queue empty 12033 1726867190.09987: checking for any_errors_fatal 12033 1726867190.10007: 
done checking for any_errors_fatal 12033 1726867190.10008: checking for max_fail_percentage 12033 1726867190.10009: done checking for max_fail_percentage 12033 1726867190.10010: checking to see if all hosts have failed and the running result is not ok 12033 1726867190.10011: done checking to see if all hosts have failed 12033 1726867190.10012: getting the remaining hosts for this loop 12033 1726867190.10013: done getting the remaining hosts for this loop 12033 1726867190.10015: getting the next task for host managed_node3 12033 1726867190.10020: done getting next task for host managed_node3 12033 1726867190.10023: ^ task is: TASK: ** TEST check IPv4 12033 1726867190.10026: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867190.10028: getting variables 12033 1726867190.10029: in VariableManager get_vars() 12033 1726867190.10039: Calling all_inventory to load vars for managed_node3 12033 1726867190.10041: Calling groups_inventory to load vars for managed_node3 12033 1726867190.10044: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867190.10048: Calling all_plugins_play to load vars for managed_node3 12033 1726867190.10051: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867190.10054: Calling groups_plugins_play to load vars for managed_node3 12033 1726867190.11244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.12933: done with get_vars() 12033 1726867190.12959: done getting variables 12033 1726867190.13012: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 17:19:50 -0400 (0:00:00.106) 0:00:29.246 ****** 12033 1726867190.13051: entering _queue_task() for managed_node3/command 12033 1726867190.13679: worker is 1 (out of 1 available) 12033 1726867190.13696: exiting _queue_task() for managed_node3/command 12033 1726867190.13714: done queuing things up, now waiting for results queue to drain 12033 1726867190.13716: waiting for pending results... 
12033 1726867190.13883: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 12033 1726867190.14049: in run() - task 0affcac9-a3a5-74bb-502b-000000000631 12033 1726867190.14083: variable 'ansible_search_path' from source: unknown 12033 1726867190.14097: variable 'ansible_search_path' from source: unknown 12033 1726867190.14149: calling self._execute() 12033 1726867190.14276: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.14294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.14317: variable 'omit' from source: magic vars 12033 1726867190.14761: variable 'ansible_distribution_major_version' from source: facts 12033 1726867190.14780: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867190.14797: variable 'omit' from source: magic vars 12033 1726867190.14867: variable 'omit' from source: magic vars 12033 1726867190.15141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867190.17596: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867190.17670: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867190.17714: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867190.17757: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867190.17872: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867190.17876: variable 'interface' from source: include params 12033 1726867190.17887: variable 'controller_device' from source: play vars 12033 1726867190.17953: variable 'controller_device' from source: play vars 12033 1726867190.17996: variable 'omit' 
from source: magic vars 12033 1726867190.18033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867190.18065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867190.18106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867190.18122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867190.18139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867190.18236: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867190.18246: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.18283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.18364: Set connection var ansible_pipelining to False 12033 1726867190.18381: Set connection var ansible_shell_executable to /bin/sh 12033 1726867190.18396: Set connection var ansible_timeout to 10 12033 1726867190.18406: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867190.18432: Set connection var ansible_connection to ssh 12033 1726867190.18434: Set connection var ansible_shell_type to sh 12033 1726867190.18584: variable 'ansible_shell_executable' from source: unknown 12033 1726867190.18588: variable 'ansible_connection' from source: unknown 12033 1726867190.18593: variable 'ansible_module_compression' from source: unknown 12033 1726867190.18596: variable 'ansible_shell_type' from source: unknown 12033 1726867190.18598: variable 'ansible_shell_executable' from source: unknown 12033 1726867190.18601: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.18603: variable 'ansible_pipelining' from source: unknown 
12033 1726867190.18605: variable 'ansible_timeout' from source: unknown 12033 1726867190.18607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.18928: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867190.18932: variable 'omit' from source: magic vars 12033 1726867190.18934: starting attempt loop 12033 1726867190.18937: running the handler 12033 1726867190.18939: _low_level_execute_command(): starting 12033 1726867190.18941: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867190.20299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.20395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.20544: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12033 1726867190.20643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.22317: stdout chunk (state=3): >>>/root <<< 12033 1726867190.22585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.22589: stdout chunk (state=3): >>><<< 12033 1726867190.22591: stderr chunk (state=3): >>><<< 12033 1726867190.22718: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.22722: _low_level_execute_command(): starting 12033 1726867190.22726: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605 `" && echo 
ansible-tmp-1726867190.2261972-13500-3434141206605="` echo /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605 `" ) && sleep 0' 12033 1726867190.23341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867190.23357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867190.23372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.23399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867190.23427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.23443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867190.23529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.23552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.23627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.25560: stdout chunk (state=3): >>>ansible-tmp-1726867190.2261972-13500-3434141206605=/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605 <<< 12033 1726867190.25772: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 12033 1726867190.25780: stdout chunk (state=3): >>><<< 12033 1726867190.25783: stderr chunk (state=3): >>><<< 12033 1726867190.25890: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867190.2261972-13500-3434141206605=/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.25905: variable 'ansible_module_compression' from source: unknown 12033 1726867190.25966: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867190.26018: variable 'ansible_facts' from source: unknown 12033 1726867190.26220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py 12033 1726867190.26302: 
Sending initial data 12033 1726867190.26311: Sent initial data (154 bytes) 12033 1726867190.27525: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.27580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867190.27632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.27673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.27821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.29416: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867190.29816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpl94o6d9b" to remote "/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py" <<< 12033 1726867190.29820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpl94o6d9b /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py <<< 12033 1726867190.30720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.30833: stderr chunk (state=3): >>><<< 12033 1726867190.30851: stdout chunk (state=3): >>><<< 12033 1726867190.30886: done transferring module to remote 12033 1726867190.30907: _low_level_execute_command(): starting 12033 1726867190.30925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/ /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py && sleep 0' 12033 1726867190.31716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867190.31766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867190.31839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867190.31915: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.32223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.32276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.32319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.34281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.34285: stderr chunk (state=3): >>><<< 12033 1726867190.34287: stdout chunk (state=3): >>><<< 12033 1726867190.34290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.34295: _low_level_execute_command(): starting 12033 1726867190.34297: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/AnsiballZ_command.py && sleep 0' 12033 1726867190.35407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.35436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867190.35511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.35602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.51076: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.10/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 229sec preferred_lft 229sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:19:50.503648", "end": "2024-09-20 17:19:50.507201", "delta": "0:00:00.003553", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867190.52462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.52615: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867190.52618: stdout chunk (state=3): >>><<< 12033 1726867190.52624: stderr chunk (state=3): >>><<< 12033 1726867190.52643: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.10/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 229sec preferred_lft 229sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:19:50.503648", "end": "2024-09-20 17:19:50.507201", "delta": "0:00:00.003553", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867190.52676: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867190.52726: _low_level_execute_command(): starting 12033 1726867190.52729: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867190.2261972-13500-3434141206605/ > /dev/null 2>&1 && sleep 0' 12033 1726867190.54111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.54116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867190.54252: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867190.54255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867190.54258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867190.54261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.54394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.54410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.54655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.56417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.56541: stderr chunk (state=3): >>><<< 12033 1726867190.56545: stdout chunk (state=3): >>><<< 12033 1726867190.56566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.56571: handler run complete 12033 1726867190.56719: Evaluated conditional (False): False 12033 1726867190.57190: variable 'address' from source: include params 12033 1726867190.57198: variable 'result' from source: set_fact 12033 1726867190.57215: Evaluated conditional (address in result.stdout): True 12033 1726867190.57227: attempt loop complete, returning result 12033 1726867190.57230: _execute() done 12033 1726867190.57399: dumping result to json 12033 1726867190.57416: done dumping result, returning 12033 1726867190.57472: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0affcac9-a3a5-74bb-502b-000000000631] 12033 1726867190.57494: sending task result for task 0affcac9-a3a5-74bb-502b-000000000631 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003553", "end": "2024-09-20 17:19:50.507201", "rc": 0, "start": "2024-09-20 17:19:50.503648" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.10/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 229sec preferred_lft 229sec 12033 1726867190.57723: no more pending results, returning what we have 12033 1726867190.57728: results queue empty 12033 1726867190.57729: checking for any_errors_fatal 12033 1726867190.57731: done checking for any_errors_fatal 12033 1726867190.57731: checking for max_fail_percentage 12033 1726867190.57733: done checking for max_fail_percentage 12033 1726867190.57734: checking to see if all hosts have failed and the running result is not ok 12033 1726867190.57735: done checking to see if all hosts have failed 12033 1726867190.57736: getting the remaining hosts for this loop 12033 1726867190.57738: done getting the 
remaining hosts for this loop 12033 1726867190.57742: getting the next task for host managed_node3 12033 1726867190.57753: done getting next task for host managed_node3 12033 1726867190.57757: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 12033 1726867190.57761: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867190.57766: getting variables 12033 1726867190.57768: in VariableManager get_vars() 12033 1726867190.57803: Calling all_inventory to load vars for managed_node3 12033 1726867190.57806: Calling groups_inventory to load vars for managed_node3 12033 1726867190.57809: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867190.57820: Calling all_plugins_play to load vars for managed_node3 12033 1726867190.57823: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867190.57826: Calling groups_plugins_play to load vars for managed_node3 12033 1726867190.58537: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000631 12033 1726867190.58540: WORKER PROCESS EXITING 12033 1726867190.60141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.62112: done with get_vars() 12033 1726867190.62142: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 17:19:50 -0400 (0:00:00.493) 0:00:29.739 ****** 12033 1726867190.62398: entering _queue_task() for managed_node3/include_tasks 12033 1726867190.63153: worker is 1 (out of 1 available) 12033 1726867190.63167: exiting _queue_task() for managed_node3/include_tasks 12033 1726867190.63184: done queuing things up, now waiting for results queue to drain 12033 1726867190.63186: waiting for pending results... 
12033 1726867190.63960: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 12033 1726867190.64388: in run() - task 0affcac9-a3a5-74bb-502b-000000000403 12033 1726867190.64395: variable 'ansible_search_path' from source: unknown 12033 1726867190.64402: variable 'ansible_search_path' from source: unknown 12033 1726867190.64405: calling self._execute() 12033 1726867190.64607: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.64621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.64638: variable 'omit' from source: magic vars 12033 1726867190.65412: variable 'ansible_distribution_major_version' from source: facts 12033 1726867190.65495: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867190.65514: _execute() done 12033 1726867190.65556: dumping result to json 12033 1726867190.65565: done dumping result, returning 12033 1726867190.65583: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [0affcac9-a3a5-74bb-502b-000000000403] 12033 1726867190.65703: sending task result for task 0affcac9-a3a5-74bb-502b-000000000403 12033 1726867190.66188: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000403 12033 1726867190.66196: WORKER PROCESS EXITING 12033 1726867190.66227: no more pending results, returning what we have 12033 1726867190.66232: in VariableManager get_vars() 12033 1726867190.66274: Calling all_inventory to load vars for managed_node3 12033 1726867190.66278: Calling groups_inventory to load vars for managed_node3 12033 1726867190.66485: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867190.66500: Calling all_plugins_play to load vars for managed_node3 12033 1726867190.66504: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867190.66507: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867190.69044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.71519: done with get_vars() 12033 1726867190.71542: variable 'ansible_search_path' from source: unknown 12033 1726867190.71544: variable 'ansible_search_path' from source: unknown 12033 1726867190.71555: variable 'item' from source: include params 12033 1726867190.71678: variable 'item' from source: include params 12033 1726867190.71713: we have included files to process 12033 1726867190.71715: generating all_blocks data 12033 1726867190.71716: done generating all_blocks data 12033 1726867190.71722: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867190.71723: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867190.71725: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867190.71971: done processing included file 12033 1726867190.71973: iterating over new_blocks loaded from include file 12033 1726867190.71974: in VariableManager get_vars() 12033 1726867190.72000: done with get_vars() 12033 1726867190.72002: filtering new block on tags 12033 1726867190.72029: done filtering new block on tags 12033 1726867190.72032: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 12033 1726867190.72038: extending task lists for all hosts with included blocks 12033 1726867190.72404: done extending task lists 12033 1726867190.72405: done processing included files 12033 1726867190.72406: results queue empty 12033 1726867190.72407: checking for any_errors_fatal 12033 1726867190.72411: 
done checking for any_errors_fatal 12033 1726867190.72412: checking for max_fail_percentage 12033 1726867190.72414: done checking for max_fail_percentage 12033 1726867190.72414: checking to see if all hosts have failed and the running result is not ok 12033 1726867190.72415: done checking to see if all hosts have failed 12033 1726867190.72416: getting the remaining hosts for this loop 12033 1726867190.72417: done getting the remaining hosts for this loop 12033 1726867190.72424: getting the next task for host managed_node3 12033 1726867190.72433: done getting next task for host managed_node3 12033 1726867190.72438: ^ task is: TASK: ** TEST check IPv6 12033 1726867190.72441: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867190.72444: getting variables 12033 1726867190.72445: in VariableManager get_vars() 12033 1726867190.72453: Calling all_inventory to load vars for managed_node3 12033 1726867190.72456: Calling groups_inventory to load vars for managed_node3 12033 1726867190.72458: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867190.72463: Calling all_plugins_play to load vars for managed_node3 12033 1726867190.72466: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867190.72468: Calling groups_plugins_play to load vars for managed_node3 12033 1726867190.73849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867190.75893: done with get_vars() 12033 1726867190.75923: done getting variables 12033 1726867190.75967: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 17:19:50 -0400 (0:00:00.137) 0:00:29.877 ****** 12033 1726867190.76147: entering _queue_task() for managed_node3/command 12033 1726867190.76937: worker is 1 (out of 1 available) 12033 1726867190.76950: exiting _queue_task() for managed_node3/command 12033 1726867190.76963: done queuing things up, now waiting for results queue to drain 12033 1726867190.76965: waiting for pending results... 
12033 1726867190.77795: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 12033 1726867190.77826: in run() - task 0affcac9-a3a5-74bb-502b-000000000652 12033 1726867190.77848: variable 'ansible_search_path' from source: unknown 12033 1726867190.78037: variable 'ansible_search_path' from source: unknown 12033 1726867190.78041: calling self._execute() 12033 1726867190.78171: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.78187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.78205: variable 'omit' from source: magic vars 12033 1726867190.78990: variable 'ansible_distribution_major_version' from source: facts 12033 1726867190.79057: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867190.79069: variable 'omit' from source: magic vars 12033 1726867190.79184: variable 'omit' from source: magic vars 12033 1726867190.79536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867190.84505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867190.84629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867190.84715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867190.84819: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867190.84916: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867190.85216: variable 'controller_device' from source: play vars 12033 1726867190.85220: variable 'omit' from source: magic vars 12033 1726867190.85223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867190.85225: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867190.85332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867190.85353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867190.85367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867190.85464: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867190.85472: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.85488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867190.85713: Set connection var ansible_pipelining to False 12033 1726867190.85729: Set connection var ansible_shell_executable to /bin/sh 12033 1726867190.85772: Set connection var ansible_timeout to 10 12033 1726867190.85784: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867190.85793: Set connection var ansible_connection to ssh 12033 1726867190.85854: Set connection var ansible_shell_type to sh 12033 1726867190.85894: variable 'ansible_shell_executable' from source: unknown 12033 1726867190.85908: variable 'ansible_connection' from source: unknown 12033 1726867190.85920: variable 'ansible_module_compression' from source: unknown 12033 1726867190.85931: variable 'ansible_shell_type' from source: unknown 12033 1726867190.85963: variable 'ansible_shell_executable' from source: unknown 12033 1726867190.85972: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867190.85984: variable 'ansible_pipelining' from source: unknown 12033 1726867190.86058: variable 'ansible_timeout' from source: unknown 12033 1726867190.86061: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867190.86121: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867190.86137: variable 'omit' from source: magic vars 12033 1726867190.86147: starting attempt loop 12033 1726867190.86153: running the handler 12033 1726867190.86184: _low_level_execute_command(): starting 12033 1726867190.86196: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867190.86883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867190.86903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867190.86984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.86999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.87049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867190.87073: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 12033 1726867190.87115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.87171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.88873: stdout chunk (state=3): >>>/root <<< 12033 1726867190.89008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.89020: stderr chunk (state=3): >>><<< 12033 1726867190.89029: stdout chunk (state=3): >>><<< 12033 1726867190.89054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.89087: _low_level_execute_command(): starting 12033 1726867190.89155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547 `" && echo ansible-tmp-1726867190.8907182-13533-57116145014547="` echo /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547 `" ) && sleep 0' 12033 1726867190.89681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867190.89696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867190.89806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.89828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867190.89851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.89876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.89962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.91871: stdout chunk (state=3): >>>ansible-tmp-1726867190.8907182-13533-57116145014547=/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547 <<< 12033 1726867190.91994: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 12033 1726867190.92012: stderr chunk (state=3): >>><<< 12033 1726867190.92022: stdout chunk (state=3): >>><<< 12033 1726867190.92108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867190.8907182-13533-57116145014547=/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.92111: variable 'ansible_module_compression' from source: unknown 12033 1726867190.92114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867190.92116: variable 'ansible_facts' from source: unknown 12033 1726867190.92173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py 12033 1726867190.92265: Sending initial data 
12033 1726867190.92268: Sent initial data (155 bytes) 12033 1726867190.92735: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.92739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.92742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867190.92745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867190.92748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.92772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.92807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.92858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.94449: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867190.94510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867190.94542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmppsbcmncc /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py <<< 12033 1726867190.94546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py" <<< 12033 1726867190.94616: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmppsbcmncc" to remote "/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py" <<< 12033 1726867190.94619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py" <<< 12033 1726867190.95173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.95213: stderr chunk (state=3): >>><<< 12033 1726867190.95217: stdout chunk (state=3): >>><<< 12033 1726867190.95242: done transferring module to remote 12033 1726867190.95250: _low_level_execute_command(): starting 12033 1726867190.95254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/ /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py && sleep 
0' 12033 1726867190.95660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.95663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.95669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867190.95671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.95717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.95724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.95772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867190.97623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867190.97627: stdout chunk (state=3): >>><<< 12033 1726867190.97629: stderr chunk (state=3): >>><<< 12033 1726867190.97632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867190.97638: _low_level_execute_command(): starting 12033 1726867190.97641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/AnsiballZ_command.py && sleep 0' 12033 1726867190.98423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867190.98431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867190.98709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867190.98713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867190.98715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867190.98722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867190.98823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867191.14288: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::17d/128 scope global dynamic noprefixroute \n valid_lft 229sec preferred_lft 229sec\n inet6 2001:db8::7855:c7ff:fea0:6986/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::7855:c7ff:fea0:6986/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:19:51.136825", "end": "2024-09-20 17:19:51.140408", "delta": "0:00:00.003583", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867191.15734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867191.15767: stderr chunk (state=3): >>><<< 12033 1726867191.15770: stdout chunk (state=3): >>><<< 12033 1726867191.15790: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::17d/128 scope global dynamic noprefixroute \n valid_lft 229sec preferred_lft 229sec\n inet6 2001:db8::7855:c7ff:fea0:6986/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::7855:c7ff:fea0:6986/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:19:51.136825", "end": "2024-09-20 17:19:51.140408", "delta": "0:00:00.003583", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867191.15828: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867191.15835: _low_level_execute_command(): starting 12033 1726867191.15838: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867190.8907182-13533-57116145014547/ > /dev/null 2>&1 && sleep 0' 12033 1726867191.16282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867191.16285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867191.16319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867191.16322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867191.16325: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867191.16327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867191.16329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867191.16388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867191.16393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867191.16395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867191.16439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867191.18270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867191.18304: stderr chunk (state=3): >>><<< 12033 1726867191.18308: stdout chunk (state=3): >>><<< 12033 1726867191.18323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867191.18328: handler run complete 12033 1726867191.18345: Evaluated conditional (False): False 12033 1726867191.18472: variable 'address' from source: include params 12033 1726867191.18475: variable 'result' from source: set_fact 12033 1726867191.18493: Evaluated conditional (address in result.stdout): True 12033 1726867191.18501: attempt loop complete, returning result 12033 1726867191.18504: _execute() done 12033 1726867191.18506: dumping result to json 12033 1726867191.18511: done dumping result, returning 12033 1726867191.18525: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0affcac9-a3a5-74bb-502b-000000000652] 12033 1726867191.18527: sending task result for task 0affcac9-a3a5-74bb-502b-000000000652 12033 1726867191.18619: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000652 12033 1726867191.18621: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003583", "end": "2024-09-20 17:19:51.140408", "rc": 0, "start": "2024-09-20 17:19:51.136825" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::17d/128 scope global dynamic noprefixroute valid_lft 229sec preferred_lft 229sec inet6 2001:db8::7855:c7ff:fea0:6986/64 scope global dynamic noprefixroute valid_lft 1795sec preferred_lft 1795sec inet6 fe80::7855:c7ff:fea0:6986/64 scope link noprefixroute valid_lft 
forever preferred_lft forever 12033 1726867191.18702: no more pending results, returning what we have 12033 1726867191.18707: results queue empty 12033 1726867191.18708: checking for any_errors_fatal 12033 1726867191.18709: done checking for any_errors_fatal 12033 1726867191.18710: checking for max_fail_percentage 12033 1726867191.18712: done checking for max_fail_percentage 12033 1726867191.18713: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.18713: done checking to see if all hosts have failed 12033 1726867191.18714: getting the remaining hosts for this loop 12033 1726867191.18716: done getting the remaining hosts for this loop 12033 1726867191.18719: getting the next task for host managed_node3 12033 1726867191.18728: done getting next task for host managed_node3 12033 1726867191.18731: ^ task is: TASK: Conditional asserts 12033 1726867191.18735: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.18739: getting variables 12033 1726867191.18741: in VariableManager get_vars() 12033 1726867191.18771: Calling all_inventory to load vars for managed_node3 12033 1726867191.18774: Calling groups_inventory to load vars for managed_node3 12033 1726867191.18778: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.18790: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.18794: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.18797: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.19619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.20818: done with get_vars() 12033 1726867191.20833: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:19:51 -0400 (0:00:00.447) 0:00:30.324 ****** 12033 1726867191.20904: entering _queue_task() for managed_node3/include_tasks 12033 1726867191.21141: worker is 1 (out of 1 available) 12033 1726867191.21155: exiting _queue_task() for managed_node3/include_tasks 12033 1726867191.21167: done queuing things up, now waiting for results queue to drain 12033 1726867191.21169: waiting for pending results... 
12033 1726867191.21365: running TaskExecutor() for managed_node3/TASK: Conditional asserts 12033 1726867191.21437: in run() - task 0affcac9-a3a5-74bb-502b-00000000008e 12033 1726867191.21449: variable 'ansible_search_path' from source: unknown 12033 1726867191.21452: variable 'ansible_search_path' from source: unknown 12033 1726867191.21694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867191.23559: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867191.23622: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867191.23661: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867191.23686: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867191.23709: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867191.23786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867191.23809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867191.23842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867191.23875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 12033 1726867191.23886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867191.24056: dumping result to json 12033 1726867191.24059: done dumping result, returning 12033 1726867191.24065: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-74bb-502b-00000000008e] 12033 1726867191.24096: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008e 12033 1726867191.24200: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008e 12033 1726867191.24203: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 12033 1726867191.24259: no more pending results, returning what we have 12033 1726867191.24263: results queue empty 12033 1726867191.24264: checking for any_errors_fatal 12033 1726867191.24273: done checking for any_errors_fatal 12033 1726867191.24273: checking for max_fail_percentage 12033 1726867191.24275: done checking for max_fail_percentage 12033 1726867191.24276: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.24337: done checking to see if all hosts have failed 12033 1726867191.24379: getting the remaining hosts for this loop 12033 1726867191.24381: done getting the remaining hosts for this loop 12033 1726867191.24384: getting the next task for host managed_node3 12033 1726867191.24392: done getting next task for host managed_node3 12033 1726867191.24394: ^ task is: TASK: Success in test '{{ lsr_description }}' 12033 1726867191.24398: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867191.24408: getting variables 12033 1726867191.24409: in VariableManager get_vars() 12033 1726867191.24441: Calling all_inventory to load vars for managed_node3 12033 1726867191.24444: Calling groups_inventory to load vars for managed_node3 12033 1726867191.24447: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.24455: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.24457: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.24460: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.25275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.26388: done with get_vars() 12033 1726867191.26405: done getting variables 12033 1726867191.26447: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867191.26612: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 
17:19:51 -0400 (0:00:00.057) 0:00:30.382 ****** 12033 1726867191.26655: entering _queue_task() for managed_node3/debug 12033 1726867191.27002: worker is 1 (out of 1 available) 12033 1726867191.27020: exiting _queue_task() for managed_node3/debug 12033 1726867191.27032: done queuing things up, now waiting for results queue to drain 12033 1726867191.27033: waiting for pending results... 12033 1726867191.27447: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 12033 1726867191.27453: in run() - task 0affcac9-a3a5-74bb-502b-00000000008f 12033 1726867191.27456: variable 'ansible_search_path' from source: unknown 12033 1726867191.27459: variable 'ansible_search_path' from source: unknown 12033 1726867191.27462: calling self._execute() 12033 1726867191.27543: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.27547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.27555: variable 'omit' from source: magic vars 12033 1726867191.27934: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.27969: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.27973: variable 'omit' from source: magic vars 12033 1726867191.28016: variable 'omit' from source: magic vars 12033 1726867191.28115: variable 'lsr_description' from source: include params 12033 1726867191.28126: variable 'omit' from source: magic vars 12033 1726867191.28157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867191.28214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867191.28250: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867191.28253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867191.28274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867191.28323: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867191.28347: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.28350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.28453: Set connection var ansible_pipelining to False 12033 1726867191.28456: Set connection var ansible_shell_executable to /bin/sh 12033 1726867191.28459: Set connection var ansible_timeout to 10 12033 1726867191.28475: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867191.28479: Set connection var ansible_connection to ssh 12033 1726867191.28482: Set connection var ansible_shell_type to sh 12033 1726867191.28517: variable 'ansible_shell_executable' from source: unknown 12033 1726867191.28520: variable 'ansible_connection' from source: unknown 12033 1726867191.28523: variable 'ansible_module_compression' from source: unknown 12033 1726867191.28525: variable 'ansible_shell_type' from source: unknown 12033 1726867191.28530: variable 'ansible_shell_executable' from source: unknown 12033 1726867191.28533: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.28535: variable 'ansible_pipelining' from source: unknown 12033 1726867191.28538: variable 'ansible_timeout' from source: unknown 12033 1726867191.28540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.28654: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867191.28667: variable 'omit' from source: magic vars 12033 1726867191.28670: starting attempt loop 12033 1726867191.28673: running the handler 12033 1726867191.28713: handler run complete 12033 1726867191.28723: attempt loop complete, returning result 12033 1726867191.28726: _execute() done 12033 1726867191.28728: dumping result to json 12033 1726867191.28731: done dumping result, returning 12033 1726867191.28737: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0affcac9-a3a5-74bb-502b-00000000008f] 12033 1726867191.28743: sending task result for task 0affcac9-a3a5-74bb-502b-00000000008f 12033 1726867191.28820: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000008f 12033 1726867191.28823: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
+++++ 12033 1726867191.28871: no more pending results, returning what we have 12033 1726867191.28874: results queue empty 12033 1726867191.28875: checking for any_errors_fatal 12033 1726867191.28884: done checking for any_errors_fatal 12033 1726867191.28885: checking for max_fail_percentage 12033 1726867191.28886: done checking for max_fail_percentage 12033 1726867191.28887: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.28888: done checking to see if all hosts have failed 12033 1726867191.28888: getting the remaining hosts for this loop 12033 1726867191.28890: done getting the remaining hosts for this loop 12033 1726867191.28893: getting the next task for host managed_node3 12033 1726867191.28901: done getting next task for host managed_node3 12033 1726867191.28904: ^ task is: TASK: Cleanup 12033 1726867191.28906: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.28910: getting variables 12033 1726867191.28911: in VariableManager get_vars() 12033 1726867191.28938: Calling all_inventory to load vars for managed_node3 12033 1726867191.28941: Calling groups_inventory to load vars for managed_node3 12033 1726867191.28944: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.28952: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.28954: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.28957: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.35124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.36808: done with get_vars() 12033 1726867191.36833: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:19:51 -0400 (0:00:00.102) 0:00:30.485 ****** 12033 1726867191.36928: entering _queue_task() for managed_node3/include_tasks 12033 1726867191.37302: worker is 1 (out of 1 available) 12033 1726867191.37314: exiting _queue_task() for managed_node3/include_tasks 12033 1726867191.37327: done queuing things up, now waiting for results queue to drain 12033 1726867191.37330: waiting for pending results... 
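For orientation, the "Success in test …" result traced above (task path run_test.yml:47) is consistent with a plain `debug` task gated on the distribution version. A hypothetical sketch only; the actual task body is not visible in this trace:

```yaml
# Hypothetical reconstruction -- the real task body in run_test.yml is not
# shown in the log. The trace records a debug action, a templated
# lsr_description variable, and an evaluated conditional on the
# distribution major version.
- name: Success in test '{{ lsr_description }}'
  debug:
    msg: >-
      +++++ Success in test '{{ lsr_description }}' +++++
  when: ansible_distribution_major_version != '6'
```

The templated `lsr_description` comes from include params, which matches the `variable 'lsr_description' from source: include params` entries in the trace.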
12033 1726867191.37892: running TaskExecutor() for managed_node3/TASK: Cleanup 12033 1726867191.37897: in run() - task 0affcac9-a3a5-74bb-502b-000000000093 12033 1726867191.37901: variable 'ansible_search_path' from source: unknown 12033 1726867191.37903: variable 'ansible_search_path' from source: unknown 12033 1726867191.37934: variable 'lsr_cleanup' from source: include params 12033 1726867191.38188: variable 'lsr_cleanup' from source: include params 12033 1726867191.38265: variable 'omit' from source: magic vars 12033 1726867191.38408: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.38435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.38452: variable 'omit' from source: magic vars 12033 1726867191.38709: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.38751: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.38755: variable 'item' from source: unknown 12033 1726867191.38808: variable 'item' from source: unknown 12033 1726867191.38846: variable 'item' from source: unknown 12033 1726867191.38968: variable 'item' from source: unknown 12033 1726867191.39288: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.39292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.39294: variable 'omit' from source: magic vars 12033 1726867191.39398: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.39402: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.39405: variable 'item' from source: unknown 12033 1726867191.39407: variable 'item' from source: unknown 12033 1726867191.39425: variable 'item' from source: unknown 12033 1726867191.39487: variable 'item' from source: unknown 12033 1726867191.39573: dumping result to json 12033 1726867191.39584: done dumping result, returning 12033 
1726867191.39615: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-74bb-502b-000000000093] 12033 1726867191.39618: sending task result for task 0affcac9-a3a5-74bb-502b-000000000093 12033 1726867191.39746: no more pending results, returning what we have 12033 1726867191.39751: in VariableManager get_vars() 12033 1726867191.39794: Calling all_inventory to load vars for managed_node3 12033 1726867191.39797: Calling groups_inventory to load vars for managed_node3 12033 1726867191.39801: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.39816: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.39820: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.39883: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.40449: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000093 12033 1726867191.40453: WORKER PROCESS EXITING 12033 1726867191.41491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.44309: done with get_vars() 12033 1726867191.44331: variable 'ansible_search_path' from source: unknown 12033 1726867191.44333: variable 'ansible_search_path' from source: unknown 12033 1726867191.44469: variable 'ansible_search_path' from source: unknown 12033 1726867191.44471: variable 'ansible_search_path' from source: unknown 12033 1726867191.44503: we have included files to process 12033 1726867191.44505: generating all_blocks data 12033 1726867191.44506: done generating all_blocks data 12033 1726867191.44564: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867191.44567: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867191.44570: Loading 
data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867191.45106: in VariableManager get_vars() 12033 1726867191.45124: done with get_vars() 12033 1726867191.45128: variable 'omit' from source: magic vars 12033 1726867191.45164: variable 'omit' from source: magic vars 12033 1726867191.45332: in VariableManager get_vars() 12033 1726867191.45343: done with get_vars() 12033 1726867191.45365: in VariableManager get_vars() 12033 1726867191.45492: done with get_vars() 12033 1726867191.45526: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12033 1726867191.45870: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12033 1726867191.46070: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12033 1726867191.46875: in VariableManager get_vars() 12033 1726867191.47060: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867191.51403: done processing included file 12033 1726867191.51405: iterating over new_blocks loaded from include file 12033 1726867191.51407: in VariableManager get_vars() 12033 1726867191.51673: done with get_vars() 12033 1726867191.51675: filtering new block on tags 12033 1726867191.52366: done filtering new block on tags 12033 1726867191.52371: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 12033 1726867191.52418: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867191.52421: loading included 
file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867191.52425: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867191.53269: done processing included file 12033 1726867191.53271: iterating over new_blocks loaded from include file 12033 1726867191.53273: in VariableManager get_vars() 12033 1726867191.53407: done with get_vars() 12033 1726867191.53410: filtering new block on tags 12033 1726867191.53442: done filtering new block on tags 12033 1726867191.53445: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 12033 1726867191.53449: extending task lists for all hosts with included blocks 12033 1726867191.57959: done extending task lists 12033 1726867191.57961: done processing included files 12033 1726867191.57961: results queue empty 12033 1726867191.57962: checking for any_errors_fatal 12033 1726867191.57967: done checking for any_errors_fatal 12033 1726867191.57968: checking for max_fail_percentage 12033 1726867191.57969: done checking for max_fail_percentage 12033 1726867191.57970: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.57971: done checking to see if all hosts have failed 12033 1726867191.57971: getting the remaining hosts for this loop 12033 1726867191.57973: done getting the remaining hosts for this loop 12033 1726867191.57976: getting the next task for host managed_node3 12033 1726867191.58115: done getting next task for host managed_node3 12033 1726867191.58124: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867191.58127: ^ state is: 
HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.58137: getting variables 12033 1726867191.58138: in VariableManager get_vars() 12033 1726867191.58155: Calling all_inventory to load vars for managed_node3 12033 1726867191.58157: Calling groups_inventory to load vars for managed_node3 12033 1726867191.58159: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.58164: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.58167: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.58169: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.60796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.64119: done with get_vars() 12033 1726867191.64147: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:19:51 -0400 (0:00:00.274) 0:00:30.759 ****** 12033 1726867191.64347: entering _queue_task() for managed_node3/include_tasks 12033 1726867191.65022: worker is 1 (out of 1 available) 12033 1726867191.65031: exiting _queue_task() for managed_node3/include_tasks 12033 1726867191.65042: done queuing things up, now waiting for results queue to drain 12033 1726867191.65043: waiting for pending results... 
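The Cleanup include traced above iterates over `lsr_cleanup`, which in this run resolves to two items (tasks/cleanup_bond_profile+device.yml and tasks/remove_test_interfaces_with_dhcp.yml, per the `included: … => (item=…)` lines). A hypothetical reconstruction of such a task:

```yaml
# Hypothetical sketch -- the real task body is not shown in the trace.
# The log records per-item conditional evaluation and two loop items,
# consistent with a looped include_tasks.
- name: Cleanup
  include_tasks: "{{ item }}"
  loop: "{{ lsr_cleanup }}"
  when: ansible_distribution_major_version != '6'
```

A looped `include_tasks` like this explains why the trace evaluates the `ansible_distribution_major_version != '6'` conditional once per item before extending the task list with the included blocks.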
12033 1726867191.65491: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867191.65820: in run() - task 0affcac9-a3a5-74bb-502b-000000000693 12033 1726867191.65833: variable 'ansible_search_path' from source: unknown 12033 1726867191.65837: variable 'ansible_search_path' from source: unknown 12033 1726867191.65872: calling self._execute() 12033 1726867191.65963: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.65966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.65982: variable 'omit' from source: magic vars 12033 1726867191.66602: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.66606: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.66608: _execute() done 12033 1726867191.66610: dumping result to json 12033 1726867191.66612: done dumping result, returning 12033 1726867191.66614: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-74bb-502b-000000000693] 12033 1726867191.66616: sending task result for task 0affcac9-a3a5-74bb-502b-000000000693 12033 1726867191.66745: no more pending results, returning what we have 12033 1726867191.66749: in VariableManager get_vars() 12033 1726867191.66785: Calling all_inventory to load vars for managed_node3 12033 1726867191.66788: Calling groups_inventory to load vars for managed_node3 12033 1726867191.66790: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.66799: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.66802: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.66804: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.67367: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000693 12033 
1726867191.67371: WORKER PROCESS EXITING 12033 1726867191.69701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.71444: done with get_vars() 12033 1726867191.71547: variable 'ansible_search_path' from source: unknown 12033 1726867191.71548: variable 'ansible_search_path' from source: unknown 12033 1726867191.71595: we have included files to process 12033 1726867191.71596: generating all_blocks data 12033 1726867191.71598: done generating all_blocks data 12033 1726867191.71599: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867191.71600: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867191.71602: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867191.72683: done processing included file 12033 1726867191.72685: iterating over new_blocks loaded from include file 12033 1726867191.72696: in VariableManager get_vars() 12033 1726867191.72747: done with get_vars() 12033 1726867191.72756: filtering new block on tags 12033 1726867191.72988: done filtering new block on tags 12033 1726867191.72991: in VariableManager get_vars() 12033 1726867191.73017: done with get_vars() 12033 1726867191.73019: filtering new block on tags 12033 1726867191.73069: done filtering new block on tags 12033 1726867191.73071: in VariableManager get_vars() 12033 1726867191.73124: done with get_vars() 12033 1726867191.73126: filtering new block on tags 12033 1726867191.73169: done filtering new block on tags 12033 1726867191.73171: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12033 1726867191.73176: extending task lists for all hosts 
with included blocks 12033 1726867191.75179: done extending task lists 12033 1726867191.75181: done processing included files 12033 1726867191.75182: results queue empty 12033 1726867191.75183: checking for any_errors_fatal 12033 1726867191.75187: done checking for any_errors_fatal 12033 1726867191.75188: checking for max_fail_percentage 12033 1726867191.75189: done checking for max_fail_percentage 12033 1726867191.75190: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.75190: done checking to see if all hosts have failed 12033 1726867191.75191: getting the remaining hosts for this loop 12033 1726867191.75192: done getting the remaining hosts for this loop 12033 1726867191.75195: getting the next task for host managed_node3 12033 1726867191.75200: done getting next task for host managed_node3 12033 1726867191.75203: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867191.75208: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867191.75218: getting variables 12033 1726867191.75220: in VariableManager get_vars() 12033 1726867191.75234: Calling all_inventory to load vars for managed_node3 12033 1726867191.75238: Calling groups_inventory to load vars for managed_node3 12033 1726867191.75240: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.75246: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.75249: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.75252: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.76538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.78286: done with get_vars() 12033 1726867191.78311: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:19:51 -0400 (0:00:00.140) 0:00:30.899 ****** 12033 1726867191.78383: entering _queue_task() for managed_node3/setup 12033 1726867191.78752: worker is 1 (out of 1 available) 12033 1726867191.78764: exiting _queue_task() for managed_node3/setup 12033 1726867191.78781: done queuing things up, now waiting for results queue to drain 12033 1726867191.78782: waiting for pending results... 
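The "Ensure ansible_facts used by role" step above (role tasks/main.yml:4) resolves to an include of tasks/set_facts.yml once the distribution-version conditional passes. A hypothetical sketch of that entry point:

```yaml
# Hypothetical sketch -- inferred from the trace: main.yml:4 queues an
# include_tasks, the conditional evaluates True, and set_facts.yml is
# then loaded and filtered on tags.
- name: Ensure ansible_facts used by role
  include_tasks: tasks/set_facts.yml
  when: ansible_distribution_major_version != '6'
```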
12033 1726867191.79095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867191.79264: in run() - task 0affcac9-a3a5-74bb-502b-0000000007c9 12033 1726867191.79383: variable 'ansible_search_path' from source: unknown 12033 1726867191.79389: variable 'ansible_search_path' from source: unknown 12033 1726867191.79393: calling self._execute() 12033 1726867191.79440: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.79451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.79464: variable 'omit' from source: magic vars 12033 1726867191.79855: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.79949: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.80105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867191.82306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867191.82389: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867191.82433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867191.82484: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867191.82516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867191.82611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867191.82646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867191.82689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867191.82736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867191.82783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867191.82825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867191.82854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867191.82886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867191.82982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867191.82985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867191.83135: variable '__network_required_facts' from source: role 
'' defaults 12033 1726867191.83150: variable 'ansible_facts' from source: unknown 12033 1726867191.83916: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12033 1726867191.83934: when evaluation is False, skipping this task 12033 1726867191.83942: _execute() done 12033 1726867191.83948: dumping result to json 12033 1726867191.83953: done dumping result, returning 12033 1726867191.83963: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-74bb-502b-0000000007c9] 12033 1726867191.84040: sending task result for task 0affcac9-a3a5-74bb-502b-0000000007c9 12033 1726867191.84104: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000007c9 12033 1726867191.84107: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867191.84185: no more pending results, returning what we have 12033 1726867191.84189: results queue empty 12033 1726867191.84190: checking for any_errors_fatal 12033 1726867191.84191: done checking for any_errors_fatal 12033 1726867191.84192: checking for max_fail_percentage 12033 1726867191.84194: done checking for max_fail_percentage 12033 1726867191.84195: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.84195: done checking to see if all hosts have failed 12033 1726867191.84196: getting the remaining hosts for this loop 12033 1726867191.84198: done getting the remaining hosts for this loop 12033 1726867191.84201: getting the next task for host managed_node3 12033 1726867191.84212: done getting next task for host managed_node3 12033 1726867191.84215: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867191.84221: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.84237: getting variables 12033 1726867191.84238: in VariableManager get_vars() 12033 1726867191.84281: Calling all_inventory to load vars for managed_node3 12033 1726867191.84284: Calling groups_inventory to load vars for managed_node3 12033 1726867191.84287: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.84298: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.84301: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.84309: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.85948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.87611: done with get_vars() 12033 1726867191.87640: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:19:51 -0400 (0:00:00.093) 0:00:30.993 ****** 12033 1726867191.87750: entering _queue_task() for managed_node3/stat 12033 1726867191.88063: worker is 1 (out of 1 available) 12033 1726867191.88192: exiting _queue_task() for managed_node3/stat 12033 1726867191.88204: done queuing things up, now waiting for results queue to drain 12033 1726867191.88205: waiting for pending results... 
12033 1726867191.88444: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867191.88622: in run() - task 0affcac9-a3a5-74bb-502b-0000000007cb 12033 1726867191.88626: variable 'ansible_search_path' from source: unknown 12033 1726867191.88630: variable 'ansible_search_path' from source: unknown 12033 1726867191.88660: calling self._execute() 12033 1726867191.88767: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.88841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.88846: variable 'omit' from source: magic vars 12033 1726867191.89206: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.89224: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.89406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867191.89682: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867191.89739: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867191.89772: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867191.89816: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867191.89928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867191.89933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867191.89964: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867191.89997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867191.90087: variable '__network_is_ostree' from source: set_fact 12033 1726867191.90144: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867191.90148: when evaluation is False, skipping this task 12033 1726867191.90151: _execute() done 12033 1726867191.90153: dumping result to json 12033 1726867191.90156: done dumping result, returning 12033 1726867191.90158: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-74bb-502b-0000000007cb] 12033 1726867191.90160: sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cb skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867191.90411: no more pending results, returning what we have 12033 1726867191.90414: results queue empty 12033 1726867191.90416: checking for any_errors_fatal 12033 1726867191.90425: done checking for any_errors_fatal 12033 1726867191.90426: checking for max_fail_percentage 12033 1726867191.90428: done checking for max_fail_percentage 12033 1726867191.90429: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.90429: done checking to see if all hosts have failed 12033 1726867191.90430: getting the remaining hosts for this loop 12033 1726867191.90432: done getting the remaining hosts for this loop 12033 1726867191.90436: getting the next task for host managed_node3 12033 1726867191.90445: done getting next task for host managed_node3 12033 
1726867191.90449: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867191.90456: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.90475: getting variables 12033 1726867191.90476: in VariableManager get_vars() 12033 1726867191.90515: Calling all_inventory to load vars for managed_node3 12033 1726867191.90518: Calling groups_inventory to load vars for managed_node3 12033 1726867191.90521: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.90531: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.90534: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.90537: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.91090: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cb 12033 1726867191.91093: WORKER PROCESS EXITING 12033 1726867191.92244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867191.93922: done with get_vars() 12033 1726867191.93945: done getting variables 12033 1726867191.94014: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:19:51 -0400 (0:00:00.063) 0:00:31.056 ****** 12033 1726867191.94055: entering _queue_task() for managed_node3/set_fact 12033 1726867191.94439: worker is 1 (out of 1 available) 12033 1726867191.94451: exiting _queue_task() for managed_node3/set_fact 12033 1726867191.94463: done queuing things up, now waiting for results queue to drain 12033 1726867191.94465: waiting for pending results... 
12033 1726867191.94712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867191.94928: in run() - task 0affcac9-a3a5-74bb-502b-0000000007cc 12033 1726867191.94960: variable 'ansible_search_path' from source: unknown 12033 1726867191.94969: variable 'ansible_search_path' from source: unknown 12033 1726867191.95010: calling self._execute() 12033 1726867191.95108: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867191.95122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867191.95136: variable 'omit' from source: magic vars 12033 1726867191.95529: variable 'ansible_distribution_major_version' from source: facts 12033 1726867191.95546: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867191.95724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867191.96013: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867191.96071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867191.96107: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867191.96155: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867191.96249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867191.96482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867191.96487: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867191.96490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867191.96493: variable '__network_is_ostree' from source: set_fact 12033 1726867191.96495: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867191.96497: when evaluation is False, skipping this task 12033 1726867191.96500: _execute() done 12033 1726867191.96502: dumping result to json 12033 1726867191.96504: done dumping result, returning 12033 1726867191.96507: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-74bb-502b-0000000007cc] 12033 1726867191.96509: sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cc 12033 1726867191.96574: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cc 12033 1726867191.96579: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867191.96631: no more pending results, returning what we have 12033 1726867191.96636: results queue empty 12033 1726867191.96637: checking for any_errors_fatal 12033 1726867191.96645: done checking for any_errors_fatal 12033 1726867191.96646: checking for max_fail_percentage 12033 1726867191.96648: done checking for max_fail_percentage 12033 1726867191.96649: checking to see if all hosts have failed and the running result is not ok 12033 1726867191.96650: done checking to see if all hosts have failed 12033 1726867191.96651: getting the remaining hosts for this loop 12033 1726867191.96653: done getting the remaining hosts for this loop 
12033 1726867191.96656: getting the next task for host managed_node3 12033 1726867191.96667: done getting next task for host managed_node3 12033 1726867191.96671: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867191.96680: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867191.96733: getting variables 12033 1726867191.96735: in VariableManager get_vars() 12033 1726867191.96796: Calling all_inventory to load vars for managed_node3 12033 1726867191.96803: Calling groups_inventory to load vars for managed_node3 12033 1726867191.96806: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867191.96817: Calling all_plugins_play to load vars for managed_node3 12033 1726867191.96901: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867191.96908: Calling groups_plugins_play to load vars for managed_node3 12033 1726867191.98441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867192.00570: done with get_vars() 12033 1726867192.00660: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:19:52 -0400 (0:00:00.067) 0:00:31.123 ****** 12033 1726867192.00786: entering _queue_task() for managed_node3/service_facts 12033 1726867192.01212: worker is 1 (out of 1 available) 12033 1726867192.01223: exiting _queue_task() for managed_node3/service_facts 12033 1726867192.01234: done queuing things up, now waiting for results queue to drain 12033 1726867192.01237: waiting for pending results... 
12033 1726867192.01595: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867192.01633: in run() - task 0affcac9-a3a5-74bb-502b-0000000007ce 12033 1726867192.01654: variable 'ansible_search_path' from source: unknown 12033 1726867192.01664: variable 'ansible_search_path' from source: unknown 12033 1726867192.01707: calling self._execute() 12033 1726867192.01810: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867192.01823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867192.01847: variable 'omit' from source: magic vars 12033 1726867192.02245: variable 'ansible_distribution_major_version' from source: facts 12033 1726867192.02381: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867192.02386: variable 'omit' from source: magic vars 12033 1726867192.02388: variable 'omit' from source: magic vars 12033 1726867192.02416: variable 'omit' from source: magic vars 12033 1726867192.02697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867192.02702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867192.02707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867192.02739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867192.02942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867192.02945: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867192.02947: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867192.02949: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867192.03097: Set connection var ansible_pipelining to False 12033 1726867192.03110: Set connection var ansible_shell_executable to /bin/sh 12033 1726867192.03120: Set connection var ansible_timeout to 10 12033 1726867192.03169: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867192.03183: Set connection var ansible_connection to ssh 12033 1726867192.03194: Set connection var ansible_shell_type to sh 12033 1726867192.03220: variable 'ansible_shell_executable' from source: unknown 12033 1726867192.03274: variable 'ansible_connection' from source: unknown 12033 1726867192.03285: variable 'ansible_module_compression' from source: unknown 12033 1726867192.03291: variable 'ansible_shell_type' from source: unknown 12033 1726867192.03297: variable 'ansible_shell_executable' from source: unknown 12033 1726867192.03302: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867192.03308: variable 'ansible_pipelining' from source: unknown 12033 1726867192.03486: variable 'ansible_timeout' from source: unknown 12033 1726867192.03489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867192.03686: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867192.03828: variable 'omit' from source: magic vars 12033 1726867192.03839: starting attempt loop 12033 1726867192.03846: running the handler 12033 1726867192.03865: _low_level_execute_command(): starting 12033 1726867192.03881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867192.05207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867192.05419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867192.05517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867192.05536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867192.05578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867192.05694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867192.07797: stdout chunk (state=3): >>>/root <<< 12033 1726867192.07802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867192.07805: stdout chunk (state=3): >>><<< 12033 1726867192.07807: stderr chunk (state=3): >>><<< 12033 1726867192.07810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867192.07813: _low_level_execute_command(): starting 12033 1726867192.07815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593 `" && echo ansible-tmp-1726867192.0770032-13593-34538231992593="` echo /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593 `" ) && sleep 0' 12033 1726867192.09019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867192.09154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867192.09241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867192.11136: stdout chunk (state=3): >>>ansible-tmp-1726867192.0770032-13593-34538231992593=/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593 <<< 12033 1726867192.11248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867192.11322: stderr chunk (state=3): >>><<< 12033 1726867192.11334: stdout chunk (state=3): >>><<< 12033 1726867192.11452: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867192.0770032-13593-34538231992593=/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867192.11497: variable 'ansible_module_compression' from source: unknown 12033 1726867192.11537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12033 1726867192.11694: variable 'ansible_facts' from source: unknown 12033 1726867192.11782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py 12033 1726867192.12295: Sending initial data 12033 1726867192.12298: Sent initial data (161 bytes) 12033 1726867192.13107: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867192.13110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867192.13199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867192.13205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867192.13389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867192.13396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867192.13460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867192.13464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867192.13537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867192.15083: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867192.15116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867192.15296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpxloy3yqi /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py <<< 12033 1726867192.15299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py" <<< 12033 1726867192.15350: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpxloy3yqi" to remote "/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py" <<< 12033 1726867192.16811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867192.16830: stderr chunk (state=3): >>><<< 12033 1726867192.16832: stdout chunk (state=3): >>><<< 12033 1726867192.16900: done transferring module to remote 12033 1726867192.16903: _low_level_execute_command(): starting 12033 1726867192.16906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/ /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py && sleep 0' 12033 1726867192.18102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867192.18336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867192.18408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867192.18421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867192.18430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867192.18502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867192.20305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867192.20418: stderr chunk (state=3): >>><<< 12033 1726867192.20434: stdout chunk (state=3): >>><<< 12033 1726867192.20438: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867192.20441: _low_level_execute_command(): starting 12033 1726867192.20444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/AnsiballZ_service_facts.py && sleep 0' 12033 1726867192.21690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867192.21697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867192.21699: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867192.21702: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867192.21705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867192.21958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867192.22314: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867192.22361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867193.73738: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 12033 1726867193.73810: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 12033 1726867193.73846: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12033 1726867193.75683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867193.75687: stdout chunk (state=3): >>><<< 12033 1726867193.75689: stderr chunk (state=3): >>><<< 12033 1726867193.75693: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867193.76907: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867193.76923: _low_level_execute_command(): starting 12033 1726867193.76932: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867192.0770032-13593-34538231992593/ > /dev/null 2>&1 && sleep 0' 12033 1726867193.77464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867193.77479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867193.77491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867193.77507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867193.77520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867193.77529: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867193.77539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867193.77596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867193.77645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867193.77668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867193.77688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867193.77762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867193.79592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867193.79632: stderr chunk (state=3): >>><<< 12033 1726867193.79635: stdout chunk (state=3): >>><<< 12033 1726867193.79653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867193.79659: handler run complete 12033 1726867193.79858: variable 'ansible_facts' from source: unknown 12033 1726867193.80041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867193.80588: variable 'ansible_facts' from source: unknown 12033 1726867193.80740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867193.80969: attempt loop complete, returning result 12033 1726867193.80972: _execute() done 12033 1726867193.80975: dumping result to json 12033 1726867193.81053: done dumping result, returning 12033 1726867193.81061: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-74bb-502b-0000000007ce] 12033 1726867193.81066: sending task result for task 0affcac9-a3a5-74bb-502b-0000000007ce 12033 1726867193.82221: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000007ce 12033 1726867193.82224: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867193.82327: no more pending results, returning what we have 12033 1726867193.82330: results queue empty 12033 1726867193.82331: checking for any_errors_fatal 12033 1726867193.82336: done checking for any_errors_fatal 12033 1726867193.82337: checking for max_fail_percentage 12033 1726867193.82339: done checking for max_fail_percentage 12033 1726867193.82340: checking to see if all hosts have failed and the running result is not ok 12033 1726867193.82341: done checking to see if all hosts have failed 12033 
1726867193.82341: getting the remaining hosts for this loop 12033 1726867193.82343: done getting the remaining hosts for this loop 12033 1726867193.82346: getting the next task for host managed_node3 12033 1726867193.82352: done getting next task for host managed_node3 12033 1726867193.82355: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867193.82362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867193.82372: getting variables 12033 1726867193.82373: in VariableManager get_vars() 12033 1726867193.82405: Calling all_inventory to load vars for managed_node3 12033 1726867193.82408: Calling groups_inventory to load vars for managed_node3 12033 1726867193.82410: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867193.82422: Calling all_plugins_play to load vars for managed_node3 12033 1726867193.82425: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867193.82429: Calling groups_plugins_play to load vars for managed_node3 12033 1726867193.83939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867193.85638: done with get_vars() 12033 1726867193.85658: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:19:53 -0400 (0:00:01.849) 0:00:32.973 ****** 12033 1726867193.85758: entering _queue_task() for managed_node3/package_facts 12033 1726867193.86073: worker is 1 (out of 1 available) 12033 1726867193.86249: exiting _queue_task() for managed_node3/package_facts 12033 1726867193.86260: done queuing things up, now waiting for results queue to drain 12033 1726867193.86262: waiting for pending results... 
12033 1726867193.86474: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867193.86607: in run() - task 0affcac9-a3a5-74bb-502b-0000000007cf 12033 1726867193.86629: variable 'ansible_search_path' from source: unknown 12033 1726867193.86637: variable 'ansible_search_path' from source: unknown 12033 1726867193.86687: calling self._execute() 12033 1726867193.86800: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867193.86817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867193.86883: variable 'omit' from source: magic vars 12033 1726867193.87248: variable 'ansible_distribution_major_version' from source: facts 12033 1726867193.87266: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867193.87280: variable 'omit' from source: magic vars 12033 1726867193.87382: variable 'omit' from source: magic vars 12033 1726867193.87421: variable 'omit' from source: magic vars 12033 1726867193.87474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867193.87519: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867193.87563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867193.87581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867193.87603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867193.87662: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867193.87666: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867193.87669: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867193.87783: Set connection var ansible_pipelining to False 12033 1726867193.87881: Set connection var ansible_shell_executable to /bin/sh 12033 1726867193.87884: Set connection var ansible_timeout to 10 12033 1726867193.87888: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867193.87890: Set connection var ansible_connection to ssh 12033 1726867193.87894: Set connection var ansible_shell_type to sh 12033 1726867193.87896: variable 'ansible_shell_executable' from source: unknown 12033 1726867193.87898: variable 'ansible_connection' from source: unknown 12033 1726867193.87900: variable 'ansible_module_compression' from source: unknown 12033 1726867193.87902: variable 'ansible_shell_type' from source: unknown 12033 1726867193.87903: variable 'ansible_shell_executable' from source: unknown 12033 1726867193.87905: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867193.87907: variable 'ansible_pipelining' from source: unknown 12033 1726867193.87909: variable 'ansible_timeout' from source: unknown 12033 1726867193.87912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867193.88080: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867193.88135: variable 'omit' from source: magic vars 12033 1726867193.88138: starting attempt loop 12033 1726867193.88140: running the handler 12033 1726867193.88143: _low_level_execute_command(): starting 12033 1726867193.88154: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867193.88988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867193.89050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867193.89082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867193.89115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867193.89210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867193.90879: stdout chunk (state=3): >>>/root <<< 12033 1726867193.91051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867193.91055: stdout chunk (state=3): >>><<< 12033 1726867193.91058: stderr chunk (state=3): >>><<< 12033 1726867193.91179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867193.91183: _low_level_execute_command(): starting 12033 1726867193.91186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340 `" && echo ansible-tmp-1726867193.9108968-13702-42106263555340="` echo /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340 `" ) && sleep 0' 12033 1726867193.91765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867193.91784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867193.91803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867193.91835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867193.91947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867193.91997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867193.92043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867193.93922: stdout chunk (state=3): >>>ansible-tmp-1726867193.9108968-13702-42106263555340=/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340 <<< 12033 1726867193.94098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867193.94102: stdout chunk (state=3): >>><<< 12033 1726867193.94105: stderr chunk (state=3): >>><<< 12033 1726867193.94119: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867193.9108968-13702-42106263555340=/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867193.94274: variable 'ansible_module_compression' from source: unknown 12033 1726867193.94279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12033 1726867193.94281: variable 'ansible_facts' from source: unknown 12033 1726867193.94538: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py 12033 1726867193.94757: Sending initial data 12033 1726867193.94761: Sent initial data (161 bytes) 12033 1726867193.95934: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867193.95937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867193.95940: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867193.95943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found <<< 12033 1726867193.95946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867193.96083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867193.96225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867193.96229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867193.96269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867193.97793: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867193.97907: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867193.97959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867193.98007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp71rsoz38 /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py <<< 12033 1726867193.98011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py" <<< 12033 1726867193.98083: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp71rsoz38" to remote "/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py" <<< 12033 1726867194.00258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867194.00261: stdout chunk (state=3): >>><<< 12033 1726867194.00264: stderr chunk (state=3): >>><<< 12033 1726867194.00297: done transferring module to remote 12033 1726867194.00309: _low_level_execute_command(): starting 12033 1726867194.00314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/ /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py && sleep 0' 12033 1726867194.00882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867194.00916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867194.00922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867194.00926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867194.00928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 <<< 12033 1726867194.00931: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867194.00984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867194.00988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867194.00990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867194.00995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867194.00997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867194.00999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867194.01002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867194.01003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867194.01056: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867194.01095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867194.01144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867194.03011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867194.03032: stderr chunk (state=3): >>><<< 12033 1726867194.03035: stdout chunk (state=3): >>><<< 12033 1726867194.03047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867194.03055: _low_level_execute_command(): starting 12033 1726867194.03064: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/AnsiballZ_package_facts.py && sleep 0' 12033 1726867194.03754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867194.03768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867194.03787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867194.03807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867194.03825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867194.03837: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867194.03851: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867194.03870: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867194.03885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867194.03967: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867194.03987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867194.04005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867194.04096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867194.48239: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 12033 1726867194.48368: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", 
"version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": 
[{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": 
"libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": 
[{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": 
"0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": 
"audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": 
"librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", 
"version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": 
"rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 12033 1726867194.48476: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": 
"1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", 
"version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": 
"7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": 
[{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12033 1726867194.50351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
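The module result above registers every installed RPM under `ansible_facts.packages`, keyed by package name, with each key mapping to a *list* of entries because multiple versions of one package (kernels, for instance) can be installed side by side. A minimal sketch of how that structure can be queried outside Ansible — the sample data is trimmed from the log above, and the `nvr` helper name is our own, not part of any Ansible API:

```python
import json

# Trimmed sample of the ansible_facts.packages structure seen in the log above.
packages_json = '''
{
  "kernel-core": [{"name": "kernel-core", "version": "6.11.0",
                   "release": "0.rc6.23.el10", "epoch": null,
                   "arch": "x86_64", "source": "rpm"}],
  "openssl": [{"name": "openssl", "version": "3.2.2",
               "release": "12.el10", "epoch": 1,
               "arch": "x86_64", "source": "rpm"}]
}
'''

def nvr(pkg: dict) -> str:
    """Render one package entry as name-version-release (NVR)."""
    return f"{pkg['name']}-{pkg['version']}-{pkg['release']}"

packages = json.loads(packages_json)

# Each key maps to a LIST of entries, so flatten per-name into NVR strings.
installed = {name: [nvr(p) for p in entries] for name, entries in packages.items()}

print(installed["openssl"][0])      # openssl-3.2.2-12.el10
print(installed["kernel-core"][0])  # kernel-core-6.11.0-0.rc6.23.el10
```

Inside a playbook the same lookup would be a Jinja2 expression such as `ansible_facts.packages['openssl'][0].version` after a `package_facts` task has run.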
<<< 12033 1726867194.50355: stdout chunk (state=3): >>><<< 12033 1726867194.50358: stderr chunk (state=3): >>><<< 12033 1726867194.50374: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867194.52884: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867194.52888: _low_level_execute_command(): starting 12033 1726867194.52893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867193.9108968-13702-42106263555340/ > /dev/null 2>&1 && sleep 0' 12033 1726867194.53554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867194.53570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867194.53593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867194.53619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867194.53699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867194.53742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867194.53760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867194.53784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867194.53863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867194.55982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867194.55985: stdout chunk (state=3): >>><<< 12033 1726867194.55987: stderr chunk (state=3): >>><<< 12033 1726867194.55990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867194.55994: handler run complete 12033 1726867194.57262: variable 'ansible_facts' from source: unknown 12033 1726867194.57843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.59825: variable 'ansible_facts' from source: unknown 12033 1726867194.60267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.60988: attempt loop complete, returning result 12033 1726867194.61008: _execute() done 12033 1726867194.61015: dumping result to json 12033 1726867194.61235: done dumping result, returning 12033 1726867194.61252: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-74bb-502b-0000000007cf] 12033 1726867194.61275: sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cf 12033 1726867194.63723: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000007cf 12033 1726867194.63727: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867194.63885: no more pending results, returning what we have 12033 1726867194.63888: results queue empty 12033 1726867194.63889: checking for any_errors_fatal 12033 1726867194.63900: done checking for any_errors_fatal 12033 1726867194.63901: checking for max_fail_percentage 12033 1726867194.63903: done checking for max_fail_percentage 12033 1726867194.63904: checking to see if all hosts have failed and the running result is not ok 12033 1726867194.63904: done checking to see if all hosts have failed 12033 1726867194.63905: getting the remaining hosts for this loop 12033 1726867194.63906: done getting the remaining hosts for this loop 12033 1726867194.63909: getting 
the next task for host managed_node3 12033 1726867194.63939: done getting next task for host managed_node3 12033 1726867194.63943: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867194.63948: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867194.63959: getting variables 12033 1726867194.63961: in VariableManager get_vars() 12033 1726867194.63995: Calling all_inventory to load vars for managed_node3 12033 1726867194.63998: Calling groups_inventory to load vars for managed_node3 12033 1726867194.64000: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867194.64013: Calling all_plugins_play to load vars for managed_node3 12033 1726867194.64016: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867194.64020: Calling groups_plugins_play to load vars for managed_node3 12033 1726867194.65243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.66863: done with get_vars() 12033 1726867194.66893: done getting variables 12033 1726867194.66957: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:19:54 -0400 (0:00:00.812) 0:00:33.785 ****** 12033 1726867194.66999: entering _queue_task() for managed_node3/debug 12033 1726867194.67339: worker is 1 (out of 1 available) 12033 1726867194.67351: exiting _queue_task() for managed_node3/debug 12033 1726867194.67363: done queuing things up, now waiting for results queue to drain 12033 1726867194.67365: waiting for pending results... 
12033 1726867194.67741: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867194.67861: in run() - task 0affcac9-a3a5-74bb-502b-000000000694 12033 1726867194.67889: variable 'ansible_search_path' from source: unknown 12033 1726867194.67946: variable 'ansible_search_path' from source: unknown 12033 1726867194.67952: calling self._execute() 12033 1726867194.68056: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.68071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.68094: variable 'omit' from source: magic vars 12033 1726867194.68512: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.68530: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867194.68583: variable 'omit' from source: magic vars 12033 1726867194.68633: variable 'omit' from source: magic vars 12033 1726867194.68747: variable 'network_provider' from source: set_fact 12033 1726867194.68771: variable 'omit' from source: magic vars 12033 1726867194.68828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867194.68872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867194.68927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867194.68931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867194.68952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867194.68993: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867194.69037: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867194.69040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.69127: Set connection var ansible_pipelining to False 12033 1726867194.69148: Set connection var ansible_shell_executable to /bin/sh 12033 1726867194.69166: Set connection var ansible_timeout to 10 12033 1726867194.69254: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867194.69257: Set connection var ansible_connection to ssh 12033 1726867194.69260: Set connection var ansible_shell_type to sh 12033 1726867194.69264: variable 'ansible_shell_executable' from source: unknown 12033 1726867194.69267: variable 'ansible_connection' from source: unknown 12033 1726867194.69269: variable 'ansible_module_compression' from source: unknown 12033 1726867194.69271: variable 'ansible_shell_type' from source: unknown 12033 1726867194.69273: variable 'ansible_shell_executable' from source: unknown 12033 1726867194.69275: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.69276: variable 'ansible_pipelining' from source: unknown 12033 1726867194.69280: variable 'ansible_timeout' from source: unknown 12033 1726867194.69283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.69420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867194.69438: variable 'omit' from source: magic vars 12033 1726867194.69447: starting attempt loop 12033 1726867194.69471: running the handler 12033 1726867194.69519: handler run complete 12033 1726867194.69539: attempt loop complete, returning result 12033 1726867194.69582: _execute() done 12033 1726867194.69585: dumping result to json 12033 1726867194.69587: done dumping result, returning 
12033 1726867194.69589: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-74bb-502b-000000000694] 12033 1726867194.69594: sending task result for task 0affcac9-a3a5-74bb-502b-000000000694 ok: [managed_node3] => {} MSG: Using network provider: nm 12033 1726867194.69858: no more pending results, returning what we have 12033 1726867194.69862: results queue empty 12033 1726867194.69863: checking for any_errors_fatal 12033 1726867194.69873: done checking for any_errors_fatal 12033 1726867194.69874: checking for max_fail_percentage 12033 1726867194.69876: done checking for max_fail_percentage 12033 1726867194.69883: checking to see if all hosts have failed and the running result is not ok 12033 1726867194.69884: done checking to see if all hosts have failed 12033 1726867194.69885: getting the remaining hosts for this loop 12033 1726867194.69887: done getting the remaining hosts for this loop 12033 1726867194.69890: getting the next task for host managed_node3 12033 1726867194.69900: done getting next task for host managed_node3 12033 1726867194.69983: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867194.69987: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
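The task at `roles/network/tasks/main.yml:7` is a `debug` action (the log shows `Loading ActionModule 'debug'`, and `network_provider` resolving from `set_fact`) that emits `Using network provider: nm`. A minimal reconstruction, offered as a sketch rather than the role's verbatim source:

```yaml
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"  # network_provider comes from an earlier set_fact
```

On this run the fact resolved to `nm` (NetworkManager), matching the `MSG:` line in the log.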
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867194.70002: getting variables 12033 1726867194.70003: in VariableManager get_vars() 12033 1726867194.70041: Calling all_inventory to load vars for managed_node3 12033 1726867194.70043: Calling groups_inventory to load vars for managed_node3 12033 1726867194.70046: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867194.70128: Calling all_plugins_play to load vars for managed_node3 12033 1726867194.70131: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867194.70140: Calling groups_plugins_play to load vars for managed_node3 12033 1726867194.70789: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000694 12033 1726867194.70795: WORKER PROCESS EXITING 12033 1726867194.71676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.73370: done with get_vars() 12033 1726867194.73398: done getting variables 12033 1726867194.73463: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:19:54 -0400 (0:00:00.065) 0:00:33.850 ****** 12033 1726867194.73511: entering _queue_task() for managed_node3/fail 12033 1726867194.74149: worker is 1 (out of 1 available) 12033 1726867194.74164: exiting _queue_task() for managed_node3/fail 12033 1726867194.74179: done queuing things up, now waiting for results queue to drain 12033 1726867194.74181: waiting for pending results... 12033 1726867194.74654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867194.74963: in run() - task 0affcac9-a3a5-74bb-502b-000000000695 12033 1726867194.74989: variable 'ansible_search_path' from source: unknown 12033 1726867194.75084: variable 'ansible_search_path' from source: unknown 12033 1726867194.75088: calling self._execute() 12033 1726867194.75140: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.75153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.75167: variable 'omit' from source: magic vars 12033 1726867194.75541: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.75561: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867194.75690: variable 'network_state' from source: role '' defaults 12033 1726867194.75705: Evaluated conditional (network_state != {}): False 12033 1726867194.75712: when evaluation is False, skipping this task 12033 1726867194.75718: _execute() done 12033 1726867194.75724: dumping result to json 12033 1726867194.75730: done dumping result, returning 12033 1726867194.75739: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0affcac9-a3a5-74bb-502b-000000000695] 12033 1726867194.75746: sending task result for task 0affcac9-a3a5-74bb-502b-000000000695 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867194.75896: no more pending results, returning what we have 12033 1726867194.75900: results queue empty 12033 1726867194.75901: checking for any_errors_fatal 12033 1726867194.75907: done checking for any_errors_fatal 12033 1726867194.75907: checking for max_fail_percentage 12033 1726867194.75909: done checking for max_fail_percentage 12033 1726867194.75910: checking to see if all hosts have failed and the running result is not ok 12033 1726867194.75910: done checking to see if all hosts have failed 12033 1726867194.75911: getting the remaining hosts for this loop 12033 1726867194.75913: done getting the remaining hosts for this loop 12033 1726867194.75916: getting the next task for host managed_node3 12033 1726867194.75924: done getting next task for host managed_node3 12033 1726867194.75983: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867194.75989: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867194.76011: getting variables 12033 1726867194.76013: in VariableManager get_vars() 12033 1726867194.76139: Calling all_inventory to load vars for managed_node3 12033 1726867194.76142: Calling groups_inventory to load vars for managed_node3 12033 1726867194.76144: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867194.76150: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000695 12033 1726867194.76152: WORKER PROCESS EXITING 12033 1726867194.76160: Calling all_plugins_play to load vars for managed_node3 12033 1726867194.76163: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867194.76166: Calling groups_plugins_play to load vars for managed_node3 12033 1726867194.77399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.79191: done with get_vars() 12033 1726867194.79213: done getting variables 12033 1726867194.79271: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:19:54 -0400 (0:00:00.057) 0:00:33.908 ****** 12033 1726867194.79308: entering _queue_task() for managed_node3/fail 12033 1726867194.79622: worker is 1 (out of 1 available) 12033 1726867194.79634: exiting _queue_task() for managed_node3/fail 12033 1726867194.79647: done queuing things up, now waiting for results queue to drain 12033 1726867194.79648: waiting for pending results... 12033 1726867194.79902: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867194.80059: in run() - task 0affcac9-a3a5-74bb-502b-000000000696 12033 1726867194.80081: variable 'ansible_search_path' from source: unknown 12033 1726867194.80091: variable 'ansible_search_path' from source: unknown 12033 1726867194.80135: calling self._execute() 12033 1726867194.80321: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.80325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.80327: variable 'omit' from source: magic vars 12033 1726867194.80615: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.80633: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867194.80761: variable 'network_state' from source: role '' defaults 12033 1726867194.80775: Evaluated conditional (network_state != {}): False 12033 1726867194.80783: when evaluation is False, skipping this task 12033 1726867194.80790: _execute() done 12033 1726867194.80797: dumping result to json 12033 1726867194.80803: done dumping result, returning 12033 1726867194.80812: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8 [0affcac9-a3a5-74bb-502b-000000000696] 12033 1726867194.80819: sending task result for task 0affcac9-a3a5-74bb-502b-000000000696 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867194.80957: no more pending results, returning what we have 12033 1726867194.80960: results queue empty 12033 1726867194.80961: checking for any_errors_fatal 12033 1726867194.80974: done checking for any_errors_fatal 12033 1726867194.80974: checking for max_fail_percentage 12033 1726867194.80976: done checking for max_fail_percentage 12033 1726867194.80980: checking to see if all hosts have failed and the running result is not ok 12033 1726867194.80981: done checking to see if all hosts have failed 12033 1726867194.80981: getting the remaining hosts for this loop 12033 1726867194.80983: done getting the remaining hosts for this loop 12033 1726867194.80987: getting the next task for host managed_node3 12033 1726867194.80995: done getting next task for host managed_node3 12033 1726867194.80999: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867194.81003: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
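Both "Abort applying the network state configuration ..." tasks are skipped for the same reason: their shared guard `network_state != {}` evaluates False, because `network_state` comes from the role's defaults (`variable 'network_state' from source: role '' defaults`) as an empty dict. The pattern looks roughly like this sketch (the `fail` message wording is an assumption):

```yaml
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording
  when: network_state != {}   # False here, so the task is skipped with "Conditional result was False"
```

When a `when` clause evaluates False, the executor short-circuits before building a connection, which is why these entries show no ssh activity.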
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867194.81022: getting variables 12033 1726867194.81023: in VariableManager get_vars() 12033 1726867194.81057: Calling all_inventory to load vars for managed_node3 12033 1726867194.81059: Calling groups_inventory to load vars for managed_node3 12033 1726867194.81061: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867194.81072: Calling all_plugins_play to load vars for managed_node3 12033 1726867194.81076: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867194.81314: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000696 12033 1726867194.81317: WORKER PROCESS EXITING 12033 1726867194.81321: Calling groups_plugins_play to load vars for managed_node3 12033 1726867194.82634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.84132: done with get_vars() 12033 1726867194.84154: done getting variables 12033 1726867194.84212: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:19:54 -0400 (0:00:00.049) 0:00:33.958 ****** 12033 1726867194.84248: entering _queue_task() for managed_node3/fail 12033 1726867194.84524: worker is 1 (out of 1 available) 12033 1726867194.84537: exiting _queue_task() for managed_node3/fail 12033 1726867194.84549: done queuing things up, now waiting for results queue to drain 12033 1726867194.84551: waiting for pending results... 12033 1726867194.84996: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867194.85001: in run() - task 0affcac9-a3a5-74bb-502b-000000000697 12033 1726867194.85023: variable 'ansible_search_path' from source: unknown 12033 1726867194.85032: variable 'ansible_search_path' from source: unknown 12033 1726867194.85072: calling self._execute() 12033 1726867194.85168: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.85185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.85200: variable 'omit' from source: magic vars 12033 1726867194.85563: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.85582: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867194.85755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867194.87864: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867194.87944: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867194.87982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867194.88025: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867194.88153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867194.88156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.88181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.88208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.88246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.88267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.88351: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.88375: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12033 1726867194.88492: variable 'ansible_distribution' from source: facts 12033 1726867194.88502: variable '__network_rh_distros' from source: role '' defaults 12033 1726867194.88515: Evaluated conditional (ansible_distribution in __network_rh_distros): True 12033 1726867194.88768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867194.88804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.88834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.88879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.88901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.88955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.88983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.89019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.89083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.89087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.89119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.89154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.89187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.89239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.89348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.89576: variable 'network_connections' from source: task vars 12033 1726867194.89595: variable 'port2_profile' from source: play vars 12033 1726867194.89658: variable 'port2_profile' from source: play vars 12033 1726867194.89681: variable 'port1_profile' from source: play vars 12033 1726867194.89743: variable 'port1_profile' from source: play vars 12033 1726867194.89756: variable 'controller_profile' from source: play vars 12033 1726867194.89823: variable 'controller_profile' from source: play vars 12033 1726867194.89837: variable 'network_state' from source: role '' defaults 12033 1726867194.89915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867194.90082: Loading 
TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867194.90127: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867194.90161: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867194.90196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867194.90325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867194.90329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867194.90331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.90346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867194.90380: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 12033 1726867194.90390: when evaluation is False, skipping this task 12033 1726867194.90398: _execute() done 12033 1726867194.90405: dumping result to json 12033 1726867194.90413: done dumping result, returning 12033 1726867194.90425: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if 
the system version of the managed host is EL10 or later [0affcac9-a3a5-74bb-502b-000000000697] 12033 1726867194.90440: sending task result for task 0affcac9-a3a5-74bb-502b-000000000697 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 12033 1726867194.90589: no more pending results, returning what we have 12033 1726867194.90593: results queue empty 12033 1726867194.90594: checking for any_errors_fatal 12033 1726867194.90600: done checking for any_errors_fatal 12033 1726867194.90601: checking for max_fail_percentage 12033 1726867194.90603: done checking for max_fail_percentage 12033 1726867194.90604: checking to see if all hosts have failed and the running result is not ok 12033 1726867194.90605: done checking to see if all hosts have failed 12033 1726867194.90605: getting the remaining hosts for this loop 12033 1726867194.90607: done getting the remaining hosts for this loop 12033 1726867194.90611: getting the next task for host managed_node3 12033 1726867194.90619: done getting next task for host managed_node3 12033 1726867194.90623: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867194.90628: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867194.90646: getting variables 12033 1726867194.90647: in VariableManager get_vars() 12033 1726867194.90688: Calling all_inventory to load vars for managed_node3 12033 1726867194.90691: Calling groups_inventory to load vars for managed_node3 12033 1726867194.90694: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867194.90704: Calling all_plugins_play to load vars for managed_node3 12033 1726867194.90708: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867194.90710: Calling groups_plugins_play to load vars for managed_node3 12033 1726867194.91590: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000697 12033 1726867194.91594: WORKER PROCESS EXITING 12033 1726867194.92251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867194.93880: done with get_vars() 12033 1726867194.93901: done getting variables 12033 1726867194.93957: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:19:54 -0400 (0:00:00.097) 0:00:34.055 ****** 12033 1726867194.93993: entering _queue_task() for managed_node3/dnf 12033 1726867194.94274: worker is 1 (out of 1 available) 12033 1726867194.94291: exiting _queue_task() for managed_node3/dnf 12033 1726867194.94303: done queuing things up, now waiting for results queue to drain 12033 1726867194.94305: waiting for pending results... 12033 1726867194.94603: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867194.94769: in run() - task 0affcac9-a3a5-74bb-502b-000000000698 12033 1726867194.94794: variable 'ansible_search_path' from source: unknown 12033 1726867194.94803: variable 'ansible_search_path' from source: unknown 12033 1726867194.94847: calling self._execute() 12033 1726867194.94945: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867194.94958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867194.94971: variable 'omit' from source: magic vars 12033 1726867194.95336: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.95353: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867194.95556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867194.97763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867194.97836: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867194.97881: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867194.97921: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867194.97951: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867194.98282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.98285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.98288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.98291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.98293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.98295: variable 'ansible_distribution' from source: facts 12033 1726867194.98297: variable 'ansible_distribution_major_version' from source: facts 12033 1726867194.98299: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12033 1726867194.98380: variable '__network_wireless_connections_defined' from source: role '' 
defaults 12033 1726867194.98499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.98533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.98563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.98609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.98633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.98679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.98705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.98728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.98771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 12033 1726867194.98790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.98832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867194.98865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867194.98895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.98936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867194.98956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867194.99113: variable 'network_connections' from source: task vars 12033 1726867194.99129: variable 'port2_profile' from source: play vars 12033 1726867194.99198: variable 'port2_profile' from source: play vars 12033 1726867194.99213: variable 'port1_profile' from source: play vars 12033 1726867194.99274: variable 'port1_profile' from source: play vars 12033 1726867194.99294: variable 'controller_profile' from source: play vars 12033 1726867194.99355: variable 'controller_profile' from source: play vars 12033 1726867194.99434: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867194.99591: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867194.99641: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867194.99675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867194.99881: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867194.99885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867194.99895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867194.99898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867194.99900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867194.99903: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867195.00154: variable 'network_connections' from source: task vars 12033 1726867195.00164: variable 'port2_profile' from source: play vars 12033 1726867195.00230: variable 'port2_profile' from source: play vars 12033 1726867195.00248: variable 'port1_profile' from source: play vars 12033 1726867195.00311: variable 'port1_profile' from source: play vars 12033 1726867195.00346: variable 'controller_profile' from 
source: play vars 12033 1726867195.00392: variable 'controller_profile' from source: play vars 12033 1726867195.00421: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867195.00454: when evaluation is False, skipping this task 12033 1726867195.00457: _execute() done 12033 1726867195.00460: dumping result to json 12033 1726867195.00462: done dumping result, returning 12033 1726867195.00464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000698] 12033 1726867195.00466: sending task result for task 0affcac9-a3a5-74bb-502b-000000000698 12033 1726867195.00799: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000698 12033 1726867195.00802: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867195.00847: no more pending results, returning what we have 12033 1726867195.00851: results queue empty 12033 1726867195.00852: checking for any_errors_fatal 12033 1726867195.00858: done checking for any_errors_fatal 12033 1726867195.00859: checking for max_fail_percentage 12033 1726867195.00860: done checking for max_fail_percentage 12033 1726867195.00861: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.00862: done checking to see if all hosts have failed 12033 1726867195.00863: getting the remaining hosts for this loop 12033 1726867195.00864: done getting the remaining hosts for this loop 12033 1726867195.00868: getting the next task for host managed_node3 12033 1726867195.00875: done getting next task for host managed_node3 12033 1726867195.00881: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12033 1726867195.00886: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.00901: getting variables 12033 1726867195.00903: in VariableManager get_vars() 12033 1726867195.00938: Calling all_inventory to load vars for managed_node3 12033 1726867195.00941: Calling groups_inventory to load vars for managed_node3 12033 1726867195.00943: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.00952: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.00955: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.00958: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.02333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.03817: done with get_vars() 12033 1726867195.03837: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867195.03909: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:19:55 -0400 (0:00:00.099) 0:00:34.155 ****** 12033 1726867195.03944: entering _queue_task() for managed_node3/yum 12033 1726867195.04238: worker is 1 (out of 1 available) 12033 1726867195.04250: exiting _queue_task() for managed_node3/yum 12033 1726867195.04263: done queuing things up, now waiting for results queue to drain 12033 1726867195.04264: waiting for pending results... 
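The DNF task above was skipped because its `when` conditional evaluated to False. The log shows the Jinja2 filter chain being tested (`network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`). As a rough, hypothetical Python sketch of what that chain computes (not the role's actual implementation — Ansible evaluates this through Jinja2's `selectattr` filter):

```python
import re

def has_team_connections(network_connections):
    """Rough Python equivalent of the Jinja2 conditional seen in the log:
    network_connections | selectattr("type", "defined")
                        | selectattr("type", "match", "^team$")
                        | list | length > 0
    """
    # selectattr("type", "defined"): keep only dicts that have a "type" key
    defined = [c for c in network_connections if "type" in c]
    # selectattr("type", "match", "^team$"): keep those whose type matches
    # the anchored regex, i.e. is exactly "team"
    teams = [c for c in defined if re.match(r"^team$", c["type"])]
    # | list | length > 0
    return len(teams) > 0

# Hypothetical profiles resembling the controller/port profiles in this run:
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
]
print(has_team_connections(connections))  # False -> the task is skipped
```

With no `team`-typed connection present, the conditional is False and the task executor short-circuits to "when evaluation is False, skipping this task", exactly as logged.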
12033 1726867195.04551: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12033 1726867195.04709: in run() - task 0affcac9-a3a5-74bb-502b-000000000699 12033 1726867195.04883: variable 'ansible_search_path' from source: unknown 12033 1726867195.04887: variable 'ansible_search_path' from source: unknown 12033 1726867195.04890: calling self._execute() 12033 1726867195.04893: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.04896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.04899: variable 'omit' from source: magic vars 12033 1726867195.05276: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.05297: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.05474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867195.07649: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867195.07720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867195.07763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867195.07806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867195.07841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867195.07924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.07963: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.07997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.08042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.08069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.08162: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.08187: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12033 1726867195.08280: when evaluation is False, skipping this task 12033 1726867195.08284: _execute() done 12033 1726867195.08286: dumping result to json 12033 1726867195.08289: done dumping result, returning 12033 1726867195.08292: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000699] 12033 1726867195.08294: sending task result for task 0affcac9-a3a5-74bb-502b-000000000699 12033 1726867195.08366: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000699 12033 1726867195.08369: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12033 1726867195.08429: no more pending results, returning 
what we have 12033 1726867195.08433: results queue empty 12033 1726867195.08434: checking for any_errors_fatal 12033 1726867195.08442: done checking for any_errors_fatal 12033 1726867195.08443: checking for max_fail_percentage 12033 1726867195.08445: done checking for max_fail_percentage 12033 1726867195.08446: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.08447: done checking to see if all hosts have failed 12033 1726867195.08448: getting the remaining hosts for this loop 12033 1726867195.08450: done getting the remaining hosts for this loop 12033 1726867195.08453: getting the next task for host managed_node3 12033 1726867195.08462: done getting next task for host managed_node3 12033 1726867195.08465: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867195.08470: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12033 1726867195.08490: getting variables 12033 1726867195.08491: in VariableManager get_vars() 12033 1726867195.08529: Calling all_inventory to load vars for managed_node3 12033 1726867195.08532: Calling groups_inventory to load vars for managed_node3 12033 1726867195.08535: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.08545: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.08548: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.08552: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.10226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.15807: done with get_vars() 12033 1726867195.15832: done getting variables 12033 1726867195.15883: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:19:55 -0400 (0:00:00.119) 0:00:34.274 ****** 12033 1726867195.15916: entering _queue_task() for managed_node3/fail 12033 1726867195.16255: worker is 1 (out of 1 available) 12033 1726867195.16268: exiting _queue_task() for managed_node3/fail 12033 1726867195.16481: done queuing things up, now waiting for results queue to drain 12033 1726867195.16484: waiting for pending results... 
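The YUM variant of the package-update check above was skipped on a different gate: `ansible_distribution_major_version | int < 8`. A minimal sketch of that version gate, assuming the fact value is a decimal string as Ansible reports it (this is an illustration, not role code):

```python
def use_yum_backend(ansible_distribution_major_version: str) -> bool:
    """Sketch of the YUM task's `when` gate: the task only runs when the
    managed host's major version is below 8 (EL7 and earlier ship yum;
    newer releases use dnf, so yum is redirected to the dnf action plugin).
    Jinja2 form: ansible_distribution_major_version | int < 8
    """
    return int(ansible_distribution_major_version) < 8

print(use_yum_backend("9"))  # False -> skipped on this host, as in the log
print(use_yum_backend("7"))  # True  -> the task would run on an EL7 host
```

This matches the log's `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` followed by `Evaluated conditional (ansible_distribution_major_version | int < 8): False` and the resulting skip.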
12033 1726867195.16614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867195.16819: in run() - task 0affcac9-a3a5-74bb-502b-00000000069a 12033 1726867195.16823: variable 'ansible_search_path' from source: unknown 12033 1726867195.16828: variable 'ansible_search_path' from source: unknown 12033 1726867195.16832: calling self._execute() 12033 1726867195.16935: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.16948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.16962: variable 'omit' from source: magic vars 12033 1726867195.17349: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.17372: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.17579: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.17689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867195.19745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867195.19821: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867195.19867: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867195.19907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867195.19936: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867195.20020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867195.20054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.20091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.20134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.20153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.20207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.20285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.20289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.20311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.20331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.20375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.20408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.20434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.20473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.20492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.20719: variable 'network_connections' from source: task vars 12033 1726867195.20723: variable 'port2_profile' from source: play vars 12033 1726867195.20757: variable 'port2_profile' from source: play vars 12033 1726867195.20775: variable 'port1_profile' from source: play vars 12033 1726867195.20846: variable 'port1_profile' from source: play vars 12033 1726867195.20861: variable 'controller_profile' from source: play vars 12033 1726867195.20924: variable 'controller_profile' from source: play vars 12033 1726867195.21007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867195.21199: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867195.21263: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867195.21284: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867195.21320: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867195.21481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867195.21486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867195.21489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.21491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867195.21508: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867195.21915: variable 'network_connections' from source: task vars 12033 1726867195.21926: variable 'port2_profile' from source: play vars 12033 1726867195.21991: variable 'port2_profile' from source: play vars 12033 1726867195.22005: variable 'port1_profile' from source: play vars 12033 1726867195.22066: variable 'port1_profile' from source: play vars 12033 1726867195.22082: variable 'controller_profile' from source: play vars 12033 1726867195.22147: variable 'controller_profile' from source: play vars 12033 1726867195.22179: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867195.22203: when evaluation is False, skipping this task 12033 1726867195.22212: _execute() done 12033 1726867195.22220: dumping result to json 12033 1726867195.22228: done dumping result, returning 12033 1726867195.22241: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-00000000069a] 12033 1726867195.22252: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069a 12033 1726867195.22551: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069a 12033 1726867195.22554: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867195.22607: no more pending results, returning what we have 12033 1726867195.22611: results queue empty 12033 1726867195.22612: checking for any_errors_fatal 12033 1726867195.22622: done checking for any_errors_fatal 12033 1726867195.22623: checking for max_fail_percentage 12033 1726867195.22624: done checking for max_fail_percentage 12033 1726867195.22625: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.22626: done checking to see if all hosts have failed 12033 1726867195.22627: getting the remaining hosts for this loop 12033 1726867195.22629: done getting the remaining hosts for this loop 12033 1726867195.22632: getting the next task for host managed_node3 12033 1726867195.22639: done getting next task for host managed_node3 12033 1726867195.22643: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12033 1726867195.22648: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.22666: getting variables 12033 1726867195.22668: in VariableManager get_vars() 12033 1726867195.22710: Calling all_inventory to load vars for managed_node3 12033 1726867195.22713: Calling groups_inventory to load vars for managed_node3 12033 1726867195.22715: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.22725: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.22728: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.22732: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.24275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.28021: done with get_vars() 12033 1726867195.28055: done getting variables 12033 1726867195.28122: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:19:55 -0400 (0:00:00.122) 0:00:34.397 ****** 12033 1726867195.28165: entering _queue_task() for managed_node3/package 12033 1726867195.29207: worker is 1 (out of 1 available) 12033 1726867195.29221: exiting _queue_task() for managed_node3/package 12033 1726867195.29235: done queuing things up, now waiting for results queue to drain 12033 1726867195.29492: waiting for pending results... 
12033 1726867195.30006: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12033 1726867195.30332: in run() - task 0affcac9-a3a5-74bb-502b-00000000069b 12033 1726867195.30353: variable 'ansible_search_path' from source: unknown 12033 1726867195.30362: variable 'ansible_search_path' from source: unknown 12033 1726867195.30450: calling self._execute() 12033 1726867195.30665: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.30681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.30698: variable 'omit' from source: magic vars 12033 1726867195.31445: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.31632: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.31910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867195.32198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867195.32249: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867195.32295: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867195.32369: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867195.32604: variable 'network_packages' from source: role '' defaults 12033 1726867195.32771: variable '__network_provider_setup' from source: role '' defaults 12033 1726867195.32783: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867195.33056: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867195.33059: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867195.33062: variable 
'__network_packages_default_nm' from source: role '' defaults 12033 1726867195.33735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867195.36156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867195.36230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867195.36279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867195.36324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867195.36352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867195.36452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.36493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.36524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.36566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.36690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 
1726867195.36693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.36696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.36698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.36727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.36746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.36970: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867195.37105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.37138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.37168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.37212: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.37235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.37583: variable 'ansible_python' from source: facts 12033 1726867195.37586: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867195.37589: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867195.37664: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867195.37940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.37968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.38198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.38241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.38310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.38401: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.38618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.38622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.38624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.38627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.38969: variable 'network_connections' from source: task vars 12033 1726867195.38981: variable 'port2_profile' from source: play vars 12033 1726867195.39269: variable 'port2_profile' from source: play vars 12033 1726867195.39272: variable 'port1_profile' from source: play vars 12033 1726867195.39402: variable 'port1_profile' from source: play vars 12033 1726867195.39417: variable 'controller_profile' from source: play vars 12033 1726867195.39620: variable 'controller_profile' from source: play vars 12033 1726867195.39698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867195.39884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 12033 1726867195.39888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.39890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867195.40141: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.40623: variable 'network_connections' from source: task vars 12033 1726867195.40632: variable 'port2_profile' from source: play vars 12033 1726867195.40852: variable 'port2_profile' from source: play vars 12033 1726867195.40867: variable 'port1_profile' from source: play vars 12033 1726867195.41011: variable 'port1_profile' from source: play vars 12033 1726867195.41030: variable 'controller_profile' from source: play vars 12033 1726867195.41142: variable 'controller_profile' from source: play vars 12033 1726867195.41183: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867195.41262: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.41563: variable 'network_connections' from source: task vars 12033 1726867195.41579: variable 'port2_profile' from source: play vars 12033 1726867195.41646: variable 'port2_profile' from source: play vars 12033 1726867195.41689: variable 'port1_profile' from source: play vars 12033 1726867195.41753: variable 'port1_profile' from source: play vars 12033 1726867195.41787: variable 'controller_profile' from source: play vars 12033 1726867195.41852: variable 'controller_profile' from source: play vars 12033 1726867195.41881: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867195.41968: variable '__network_team_connections_defined' from 
source: role '' defaults 12033 1726867195.42338: variable 'network_connections' from source: task vars 12033 1726867195.42349: variable 'port2_profile' from source: play vars 12033 1726867195.42440: variable 'port2_profile' from source: play vars 12033 1726867195.42447: variable 'port1_profile' from source: play vars 12033 1726867195.42506: variable 'port1_profile' from source: play vars 12033 1726867195.42544: variable 'controller_profile' from source: play vars 12033 1726867195.42591: variable 'controller_profile' from source: play vars 12033 1726867195.42642: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867195.42763: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867195.42768: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867195.42800: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867195.43030: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867195.43622: variable 'network_connections' from source: task vars 12033 1726867195.43632: variable 'port2_profile' from source: play vars 12033 1726867195.43723: variable 'port2_profile' from source: play vars 12033 1726867195.43737: variable 'port1_profile' from source: play vars 12033 1726867195.43866: variable 'port1_profile' from source: play vars 12033 1726867195.43869: variable 'controller_profile' from source: play vars 12033 1726867195.43889: variable 'controller_profile' from source: play vars 12033 1726867195.43902: variable 'ansible_distribution' from source: facts 12033 1726867195.43923: variable '__network_rh_distros' from source: role '' defaults 12033 1726867195.43935: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.43993: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867195.44194: 
variable 'ansible_distribution' from source: facts 12033 1726867195.44197: variable '__network_rh_distros' from source: role '' defaults 12033 1726867195.44200: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.44204: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867195.44393: variable 'ansible_distribution' from source: facts 12033 1726867195.44419: variable '__network_rh_distros' from source: role '' defaults 12033 1726867195.44520: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.44524: variable 'network_provider' from source: set_fact 12033 1726867195.44527: variable 'ansible_facts' from source: unknown 12033 1726867195.45758: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12033 1726867195.45839: when evaluation is False, skipping this task 12033 1726867195.45848: _execute() done 12033 1726867195.45855: dumping result to json 12033 1726867195.45862: done dumping result, returning 12033 1726867195.45874: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-74bb-502b-00000000069b] 12033 1726867195.45892: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069b skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12033 1726867195.46228: no more pending results, returning what we have 12033 1726867195.46232: results queue empty 12033 1726867195.46233: checking for any_errors_fatal 12033 1726867195.46240: done checking for any_errors_fatal 12033 1726867195.46240: checking for max_fail_percentage 12033 1726867195.46242: done checking for max_fail_percentage 12033 1726867195.46243: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.46244: done checking to see if 
all hosts have failed 12033 1726867195.46245: getting the remaining hosts for this loop 12033 1726867195.46247: done getting the remaining hosts for this loop 12033 1726867195.46250: getting the next task for host managed_node3 12033 1726867195.46257: done getting next task for host managed_node3 12033 1726867195.46266: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867195.46271: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.46316: getting variables 12033 1726867195.46318: in VariableManager get_vars() 12033 1726867195.46389: Calling all_inventory to load vars for managed_node3 12033 1726867195.46391: Calling groups_inventory to load vars for managed_node3 12033 1726867195.46510: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.46521: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.46524: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.46526: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.47130: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069b 12033 1726867195.47133: WORKER PROCESS EXITING 12033 1726867195.48403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.50196: done with get_vars() 12033 1726867195.50218: done getting variables 12033 1726867195.50284: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:19:55 -0400 (0:00:00.221) 0:00:34.619 ****** 12033 1726867195.50326: entering _queue_task() for managed_node3/package 12033 1726867195.50655: worker is 1 (out of 1 available) 12033 1726867195.50669: exiting _queue_task() for managed_node3/package 12033 1726867195.50688: done queuing things up, now waiting for results queue to drain 12033 1726867195.50690: waiting for pending results... 
12033 1726867195.50947: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867195.51125: in run() - task 0affcac9-a3a5-74bb-502b-00000000069c 12033 1726867195.51146: variable 'ansible_search_path' from source: unknown 12033 1726867195.51155: variable 'ansible_search_path' from source: unknown 12033 1726867195.51199: calling self._execute() 12033 1726867195.51300: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.51483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.51487: variable 'omit' from source: magic vars 12033 1726867195.51879: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.51944: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.52345: variable 'network_state' from source: role '' defaults 12033 1726867195.52453: Evaluated conditional (network_state != {}): False 12033 1726867195.52456: when evaluation is False, skipping this task 12033 1726867195.52459: _execute() done 12033 1726867195.52461: dumping result to json 12033 1726867195.52463: done dumping result, returning 12033 1726867195.52466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-74bb-502b-00000000069c] 12033 1726867195.52469: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069c 12033 1726867195.52549: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069c 12033 1726867195.52558: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867195.52607: no more pending results, returning what we have 12033 1726867195.52612: results queue empty 12033 1726867195.52613: checking 
for any_errors_fatal 12033 1726867195.52618: done checking for any_errors_fatal 12033 1726867195.52618: checking for max_fail_percentage 12033 1726867195.52620: done checking for max_fail_percentage 12033 1726867195.52621: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.52621: done checking to see if all hosts have failed 12033 1726867195.52622: getting the remaining hosts for this loop 12033 1726867195.52624: done getting the remaining hosts for this loop 12033 1726867195.52627: getting the next task for host managed_node3 12033 1726867195.52634: done getting next task for host managed_node3 12033 1726867195.52637: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867195.52643: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.52659: getting variables 12033 1726867195.52660: in VariableManager get_vars() 12033 1726867195.52702: Calling all_inventory to load vars for managed_node3 12033 1726867195.52705: Calling groups_inventory to load vars for managed_node3 12033 1726867195.52708: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.52719: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.52722: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.52724: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.55969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.59404: done with get_vars() 12033 1726867195.59432: done getting variables 12033 1726867195.59495: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:19:55 -0400 (0:00:00.092) 0:00:34.711 ****** 12033 1726867195.59535: entering _queue_task() for managed_node3/package 12033 1726867195.60236: worker is 1 (out of 1 available) 12033 1726867195.60249: exiting _queue_task() for managed_node3/package 12033 1726867195.60262: done queuing things up, now waiting for results queue to drain 12033 1726867195.60264: waiting for pending results... 
12033 1726867195.60761: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867195.61237: in run() - task 0affcac9-a3a5-74bb-502b-00000000069d 12033 1726867195.61242: variable 'ansible_search_path' from source: unknown 12033 1726867195.61244: variable 'ansible_search_path' from source: unknown 12033 1726867195.61247: calling self._execute() 12033 1726867195.61562: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.61566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.61569: variable 'omit' from source: magic vars 12033 1726867195.62427: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.62431: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.62518: variable 'network_state' from source: role '' defaults 12033 1726867195.62697: Evaluated conditional (network_state != {}): False 12033 1726867195.62814: when evaluation is False, skipping this task 12033 1726867195.62818: _execute() done 12033 1726867195.62821: dumping result to json 12033 1726867195.62823: done dumping result, returning 12033 1726867195.62826: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-74bb-502b-00000000069d] 12033 1726867195.62828: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069d 12033 1726867195.62911: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069d 12033 1726867195.62918: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867195.62963: no more pending results, returning what we have 12033 1726867195.62967: results queue empty 12033 1726867195.62968: checking for 
any_errors_fatal 12033 1726867195.62979: done checking for any_errors_fatal 12033 1726867195.62980: checking for max_fail_percentage 12033 1726867195.62982: done checking for max_fail_percentage 12033 1726867195.62983: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.62984: done checking to see if all hosts have failed 12033 1726867195.62985: getting the remaining hosts for this loop 12033 1726867195.62988: done getting the remaining hosts for this loop 12033 1726867195.62994: getting the next task for host managed_node3 12033 1726867195.63002: done getting next task for host managed_node3 12033 1726867195.63007: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867195.63014: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.63033: getting variables 12033 1726867195.63035: in VariableManager get_vars() 12033 1726867195.63072: Calling all_inventory to load vars for managed_node3 12033 1726867195.63074: Calling groups_inventory to load vars for managed_node3 12033 1726867195.63076: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.63198: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.63201: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.63204: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.66286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.70263: done with get_vars() 12033 1726867195.70299: done getting variables 12033 1726867195.70358: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:19:55 -0400 (0:00:00.110) 0:00:34.821 ****** 12033 1726867195.70606: entering _queue_task() for managed_node3/service 12033 1726867195.71172: worker is 1 (out of 1 available) 12033 1726867195.71386: exiting _queue_task() for managed_node3/service 12033 1726867195.71399: done queuing things up, now waiting for results queue to drain 12033 1726867195.71401: waiting for pending results... 
12033 1726867195.71925: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867195.72069: in run() - task 0affcac9-a3a5-74bb-502b-00000000069e 12033 1726867195.72146: variable 'ansible_search_path' from source: unknown 12033 1726867195.72345: variable 'ansible_search_path' from source: unknown 12033 1726867195.72349: calling self._execute() 12033 1726867195.72489: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.72505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.72521: variable 'omit' from source: magic vars 12033 1726867195.73436: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.73441: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.73684: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.74118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867195.78281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867195.78572: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867195.78699: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867195.78849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867195.78874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867195.79018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12033 1726867195.79052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.79076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.79119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.79134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.79299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.79323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.79348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.79525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.79528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.79558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.79798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.79852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.79859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.79874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.80217: variable 'network_connections' from source: task vars 12033 1726867195.80383: variable 'port2_profile' from source: play vars 12033 1726867195.80479: variable 'port2_profile' from source: play vars 12033 1726867195.80492: variable 'port1_profile' from source: play vars 12033 1726867195.80671: variable 'port1_profile' from source: play vars 12033 1726867195.80681: variable 'controller_profile' from source: play vars 12033 1726867195.80742: variable 'controller_profile' from source: play vars 12033 1726867195.80936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867195.81349: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867195.81388: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867195.81422: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867195.81513: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867195.81669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867195.81696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867195.81720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.81782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867195.81917: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867195.82362: variable 'network_connections' from source: task vars 12033 1726867195.82365: variable 'port2_profile' from source: play vars 12033 1726867195.82569: variable 'port2_profile' from source: play vars 12033 1726867195.82572: variable 'port1_profile' from source: play vars 12033 1726867195.82698: variable 'port1_profile' from source: play vars 12033 1726867195.82785: variable 'controller_profile' from source: play vars 12033 1726867195.82876: variable 'controller_profile' from source: play vars 12033 1726867195.82906: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867195.82918: when evaluation is False, skipping this task 12033 1726867195.82921: _execute() done 12033 1726867195.82924: dumping result to json 12033 1726867195.82927: done dumping result, returning 12033 1726867195.82929: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-00000000069e] 12033 1726867195.82931: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069e 12033 1726867195.83258: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069e 12033 1726867195.83262: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867195.83315: no more pending results, returning what we have 12033 1726867195.83319: results queue empty 12033 1726867195.83320: checking for any_errors_fatal 12033 1726867195.83329: done checking for any_errors_fatal 12033 1726867195.83330: checking for max_fail_percentage 12033 1726867195.83332: done checking for max_fail_percentage 12033 1726867195.83333: checking to see if all hosts have failed and the running result is not ok 12033 1726867195.83334: done checking to see if all hosts have failed 12033 1726867195.83335: getting the remaining hosts for this loop 12033 1726867195.83337: done getting the remaining hosts for this loop 12033 1726867195.83340: getting the next task for host managed_node3 12033 1726867195.83349: done getting next task for host managed_node3 12033 1726867195.83353: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867195.83358: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867195.83375: getting variables 12033 1726867195.83379: in VariableManager get_vars() 12033 1726867195.83422: Calling all_inventory to load vars for managed_node3 12033 1726867195.83424: Calling groups_inventory to load vars for managed_node3 12033 1726867195.83427: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867195.83437: Calling all_plugins_play to load vars for managed_node3 12033 1726867195.83440: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867195.83443: Calling groups_plugins_play to load vars for managed_node3 12033 1726867195.86352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867195.88873: done with get_vars() 12033 1726867195.88896: done getting variables 12033 1726867195.88963: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:19:55 -0400 (0:00:00.183) 0:00:35.005 ****** 12033 1726867195.89001: entering _queue_task() for managed_node3/service 12033 1726867195.89370: worker is 1 (out of 1 available) 12033 1726867195.89386: exiting _queue_task() for managed_node3/service 12033 1726867195.89399: done queuing things up, now waiting for results queue to drain 12033 1726867195.89401: waiting for pending results... 
12033 1726867195.89983: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867195.90107: in run() - task 0affcac9-a3a5-74bb-502b-00000000069f 12033 1726867195.90134: variable 'ansible_search_path' from source: unknown 12033 1726867195.90142: variable 'ansible_search_path' from source: unknown 12033 1726867195.90183: calling self._execute() 12033 1726867195.90289: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867195.90320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867195.90324: variable 'omit' from source: magic vars 12033 1726867195.90713: variable 'ansible_distribution_major_version' from source: facts 12033 1726867195.90755: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867195.90903: variable 'network_provider' from source: set_fact 12033 1726867195.90917: variable 'network_state' from source: role '' defaults 12033 1726867195.90931: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12033 1726867195.90972: variable 'omit' from source: magic vars 12033 1726867195.91020: variable 'omit' from source: magic vars 12033 1726867195.91052: variable 'network_service_name' from source: role '' defaults 12033 1726867195.91125: variable 'network_service_name' from source: role '' defaults 12033 1726867195.91241: variable '__network_provider_setup' from source: role '' defaults 12033 1726867195.91252: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867195.91336: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867195.91435: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867195.91438: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867195.91637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12033 1726867195.94168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867195.94253: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867195.94327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867195.94394: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867195.94429: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867195.94512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.94641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.94644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.94646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.94648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.94674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867195.94701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.94726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.94770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.94789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.95020: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867195.95137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.95160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.95187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.95227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.95241: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.95330: variable 'ansible_python' from source: facts 12033 1726867195.95347: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867195.95431: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867195.95514: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867195.95634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.95658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.95683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.95727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.95737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.95781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867195.95822: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867195.95897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.95900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867195.95903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867195.96022: variable 'network_connections' from source: task vars 12033 1726867195.96029: variable 'port2_profile' from source: play vars 12033 1726867195.96107: variable 'port2_profile' from source: play vars 12033 1726867195.96323: variable 'port1_profile' from source: play vars 12033 1726867195.96326: variable 'port1_profile' from source: play vars 12033 1726867195.96328: variable 'controller_profile' from source: play vars 12033 1726867195.96365: variable 'controller_profile' from source: play vars 12033 1726867195.96513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867195.96804: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867195.96862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867195.96947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867195.97006: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867195.97093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867195.97096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867195.97110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867195.97142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867195.97386: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.97715: variable 'network_connections' from source: task vars 12033 1726867195.97722: variable 'port2_profile' from source: play vars 12033 1726867195.97806: variable 'port2_profile' from source: play vars 12033 1726867195.97817: variable 'port1_profile' from source: play vars 12033 1726867195.97888: variable 'port1_profile' from source: play vars 12033 1726867195.97912: variable 'controller_profile' from source: play vars 12033 1726867195.97986: variable 'controller_profile' from source: play vars 12033 1726867195.98023: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867195.98282: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867195.98447: variable 'network_connections' from source: task vars 12033 1726867195.98485: variable 'port2_profile' from source: play vars 12033 1726867195.98686: variable 'port2_profile' from source: play vars 12033 
1726867195.98689: variable 'port1_profile' from source: play vars 12033 1726867195.98691: variable 'port1_profile' from source: play vars 12033 1726867195.98693: variable 'controller_profile' from source: play vars 12033 1726867195.98704: variable 'controller_profile' from source: play vars 12033 1726867195.98730: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867195.98816: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867195.99114: variable 'network_connections' from source: task vars 12033 1726867195.99130: variable 'port2_profile' from source: play vars 12033 1726867195.99195: variable 'port2_profile' from source: play vars 12033 1726867195.99198: variable 'port1_profile' from source: play vars 12033 1726867195.99252: variable 'port1_profile' from source: play vars 12033 1726867195.99258: variable 'controller_profile' from source: play vars 12033 1726867195.99310: variable 'controller_profile' from source: play vars 12033 1726867195.99369: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867195.99413: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867195.99419: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867195.99463: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867195.99598: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867195.99933: variable 'network_connections' from source: task vars 12033 1726867195.99937: variable 'port2_profile' from source: play vars 12033 1726867195.99980: variable 'port2_profile' from source: play vars 12033 1726867195.99988: variable 'port1_profile' from source: play vars 12033 1726867196.00030: variable 'port1_profile' from source: play vars 12033 1726867196.00038: variable 'controller_profile' from source: play vars 12033 
1726867196.00080: variable 'controller_profile' from source: play vars 12033 1726867196.00086: variable 'ansible_distribution' from source: facts 12033 1726867196.00089: variable '__network_rh_distros' from source: role '' defaults 12033 1726867196.00097: variable 'ansible_distribution_major_version' from source: facts 12033 1726867196.00116: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867196.00225: variable 'ansible_distribution' from source: facts 12033 1726867196.00228: variable '__network_rh_distros' from source: role '' defaults 12033 1726867196.00231: variable 'ansible_distribution_major_version' from source: facts 12033 1726867196.00243: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867196.00352: variable 'ansible_distribution' from source: facts 12033 1726867196.00355: variable '__network_rh_distros' from source: role '' defaults 12033 1726867196.00360: variable 'ansible_distribution_major_version' from source: facts 12033 1726867196.00386: variable 'network_provider' from source: set_fact 12033 1726867196.00405: variable 'omit' from source: magic vars 12033 1726867196.00429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867196.00452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867196.00467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867196.00481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867196.00490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867196.00514: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867196.00517: 
variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867196.00520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867196.00590: Set connection var ansible_pipelining to False 12033 1726867196.00598: Set connection var ansible_shell_executable to /bin/sh 12033 1726867196.00605: Set connection var ansible_timeout to 10 12033 1726867196.00610: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867196.00612: Set connection var ansible_connection to ssh 12033 1726867196.00617: Set connection var ansible_shell_type to sh 12033 1726867196.00635: variable 'ansible_shell_executable' from source: unknown 12033 1726867196.00638: variable 'ansible_connection' from source: unknown 12033 1726867196.00640: variable 'ansible_module_compression' from source: unknown 12033 1726867196.00643: variable 'ansible_shell_type' from source: unknown 12033 1726867196.00645: variable 'ansible_shell_executable' from source: unknown 12033 1726867196.00649: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867196.00651: variable 'ansible_pipelining' from source: unknown 12033 1726867196.00653: variable 'ansible_timeout' from source: unknown 12033 1726867196.00660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867196.00731: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867196.00740: variable 'omit' from source: magic vars 12033 1726867196.00745: starting attempt loop 12033 1726867196.00747: running the handler 12033 1726867196.00806: variable 'ansible_facts' from source: unknown 12033 1726867196.01357: _low_level_execute_command(): starting 12033 1726867196.01361: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867196.01846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.01855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867196.01861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.01886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867196.01890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.01947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867196.01951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867196.01953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.02013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.03700: stdout chunk (state=3): >>>/root <<< 12033 1726867196.03798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867196.03834: stderr chunk (state=3): >>><<< 12033 1726867196.03837: stdout chunk (state=3): >>><<< 12033 1726867196.03846: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867196.03856: _low_level_execute_command(): starting 12033 1726867196.03862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916 `" && echo ansible-tmp-1726867196.0384617-13795-112147537101916="` echo /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916 `" ) && sleep 0' 12033 1726867196.04261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.04297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 12033 1726867196.04301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.04303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.04305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.04352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867196.04355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867196.04359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.04409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.06294: stdout chunk (state=3): >>>ansible-tmp-1726867196.0384617-13795-112147537101916=/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916 <<< 12033 1726867196.06398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867196.06421: stderr chunk (state=3): >>><<< 12033 1726867196.06424: stdout chunk (state=3): >>><<< 12033 1726867196.06438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867196.0384617-13795-112147537101916=/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867196.06464: variable 'ansible_module_compression' from source: unknown 12033 1726867196.06506: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12033 1726867196.06557: variable 'ansible_facts' from source: unknown 12033 1726867196.06692: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py 12033 1726867196.06790: Sending initial data 12033 1726867196.06797: Sent initial data (156 bytes) 12033 1726867196.07238: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867196.07241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.07244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.07246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.07296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867196.07299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.07349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.08898: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867196.08946: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867196.09004: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpqj3po2cz /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py <<< 12033 1726867196.09007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py" <<< 12033 1726867196.09047: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpqj3po2cz" to remote "/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py" <<< 12033 1726867196.10444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867196.10455: stderr chunk (state=3): >>><<< 12033 1726867196.10458: stdout chunk (state=3): >>><<< 12033 1726867196.10487: done transferring module to remote 12033 1726867196.10497: _low_level_execute_command(): starting 12033 1726867196.10501: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/ /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py && sleep 0' 12033 1726867196.10946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.10950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867196.10952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867196.10954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867196.10956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.11005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867196.11011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.11055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.12787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867196.12813: stderr chunk (state=3): >>><<< 12033 1726867196.12816: stdout chunk (state=3): >>><<< 12033 1726867196.12828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867196.12831: _low_level_execute_command(): starting 12033 1726867196.12836: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/AnsiballZ_systemd.py && sleep 0' 12033 1726867196.13236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867196.13245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867196.13266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.13273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867196.13276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867196.13328: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867196.13331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.13388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.42230: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10383360", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310440448", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "675684000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", 
"MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 12033 1726867196.42275: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": 
"3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12033 1726867196.44320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867196.44428: stderr chunk (state=3): >>><<< 12033 1726867196.44432: stdout chunk (state=3): >>><<< 12033 1726867196.44435: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10383360", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3310440448", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "675684000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867196.44995: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867196.44999: _low_level_execute_command(): starting 12033 1726867196.45001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867196.0384617-13795-112147537101916/ > /dev/null 2>&1 && sleep 0' 12033 1726867196.46259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867196.46402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867196.46440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867196.48299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867196.48328: stdout chunk (state=3): >>><<< 12033 1726867196.48331: stderr chunk (state=3): >>><<< 12033 1726867196.48353: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867196.48383: handler run complete 12033 1726867196.48452: attempt loop complete, returning result 12033 1726867196.48460: _execute() done 12033 
1726867196.48555: dumping result to json 12033 1726867196.48558: done dumping result, returning 12033 1726867196.48561: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-74bb-502b-00000000069f] 12033 1726867196.48563: sending task result for task 0affcac9-a3a5-74bb-502b-00000000069f ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867196.48874: no more pending results, returning what we have 12033 1726867196.48880: results queue empty 12033 1726867196.48881: checking for any_errors_fatal 12033 1726867196.48889: done checking for any_errors_fatal 12033 1726867196.48890: checking for max_fail_percentage 12033 1726867196.48895: done checking for max_fail_percentage 12033 1726867196.48896: checking to see if all hosts have failed and the running result is not ok 12033 1726867196.48897: done checking to see if all hosts have failed 12033 1726867196.48898: getting the remaining hosts for this loop 12033 1726867196.48899: done getting the remaining hosts for this loop 12033 1726867196.48902: getting the next task for host managed_node3 12033 1726867196.48910: done getting next task for host managed_node3 12033 1726867196.48914: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867196.48920: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867196.48933: getting variables 12033 1726867196.48935: in VariableManager get_vars() 12033 1726867196.48970: Calling all_inventory to load vars for managed_node3 12033 1726867196.48972: Calling groups_inventory to load vars for managed_node3 12033 1726867196.48975: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867196.49199: Calling all_plugins_play to load vars for managed_node3 12033 1726867196.49203: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867196.49207: Calling groups_plugins_play to load vars for managed_node3 12033 1726867196.50045: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000069f 12033 1726867196.50049: WORKER PROCESS EXITING 12033 1726867196.52139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867196.55667: done with get_vars() 12033 1726867196.55839: done getting variables 12033 1726867196.56079: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:19:56 -0400 (0:00:00.671) 0:00:35.676 ****** 12033 1726867196.56123: entering _queue_task() for managed_node3/service 12033 1726867196.56951: worker is 1 (out of 1 available) 12033 1726867196.56964: exiting _queue_task() for managed_node3/service 12033 1726867196.57091: done queuing things up, now waiting for results queue to drain 12033 1726867196.57093: waiting for pending results... 12033 1726867196.57588: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867196.57962: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a0 12033 1726867196.57975: variable 'ansible_search_path' from source: unknown 12033 1726867196.57981: variable 'ansible_search_path' from source: unknown 12033 1726867196.58017: calling self._execute() 12033 1726867196.58387: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867196.58391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867196.58394: variable 'omit' from source: magic vars 12033 1726867196.59212: variable 'ansible_distribution_major_version' from source: facts 12033 1726867196.59216: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867196.59245: variable 'network_provider' from source: set_fact 12033 1726867196.59250: Evaluated conditional (network_provider == "nm"): True 12033 1726867196.59646: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867196.59733: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867196.60100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867196.64916: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867196.64973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867196.65214: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867196.65246: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867196.65271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867196.65513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867196.65517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867196.65608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867196.65646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867196.65660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867196.65712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 
1726867196.65734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867196.65758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867196.65998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867196.66013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867196.66056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867196.66072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867196.66099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867196.66134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867196.66148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 12033 1726867196.66602: variable 'network_connections' from source: task vars 12033 1726867196.66605: variable 'port2_profile' from source: play vars 12033 1726867196.66608: variable 'port2_profile' from source: play vars 12033 1726867196.66610: variable 'port1_profile' from source: play vars 12033 1726867196.66744: variable 'port1_profile' from source: play vars 12033 1726867196.66753: variable 'controller_profile' from source: play vars 12033 1726867196.66816: variable 'controller_profile' from source: play vars 12033 1726867196.67252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867196.67382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867196.67618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867196.67622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867196.67651: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867196.67694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867196.67717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867196.67741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867196.67765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867196.68118: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867196.68485: variable 'network_connections' from source: task vars 12033 1726867196.68488: variable 'port2_profile' from source: play vars 12033 1726867196.68519: variable 'port2_profile' from source: play vars 12033 1726867196.68526: variable 'port1_profile' from source: play vars 12033 1726867196.68593: variable 'port1_profile' from source: play vars 12033 1726867196.68810: variable 'controller_profile' from source: play vars 12033 1726867196.68852: variable 'controller_profile' from source: play vars 12033 1726867196.68882: Evaluated conditional (__network_wpa_supplicant_required): False 12033 1726867196.68886: when evaluation is False, skipping this task 12033 1726867196.68888: _execute() done 12033 1726867196.68891: dumping result to json 12033 1726867196.68893: done dumping result, returning 12033 1726867196.68902: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-74bb-502b-0000000006a0] 12033 1726867196.68907: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a0 12033 1726867196.69483: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a0 12033 1726867196.69487: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12033 1726867196.69530: no more pending results, returning what we have 12033 1726867196.69533: results queue empty 12033 1726867196.69534: checking for any_errors_fatal 12033 1726867196.69552: done checking for any_errors_fatal 12033 1726867196.69554: checking for max_fail_percentage 12033 1726867196.69556: done checking for max_fail_percentage 12033 1726867196.69557: checking to see if all 
hosts have failed and the running result is not ok 12033 1726867196.69557: done checking to see if all hosts have failed 12033 1726867196.69558: getting the remaining hosts for this loop 12033 1726867196.69560: done getting the remaining hosts for this loop 12033 1726867196.69563: getting the next task for host managed_node3 12033 1726867196.69571: done getting next task for host managed_node3 12033 1726867196.69574: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867196.69581: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867196.69598: getting variables 12033 1726867196.69600: in VariableManager get_vars() 12033 1726867196.69633: Calling all_inventory to load vars for managed_node3 12033 1726867196.69635: Calling groups_inventory to load vars for managed_node3 12033 1726867196.69638: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867196.69646: Calling all_plugins_play to load vars for managed_node3 12033 1726867196.69648: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867196.69650: Calling groups_plugins_play to load vars for managed_node3 12033 1726867196.74223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867196.80083: done with get_vars() 12033 1726867196.80114: done getting variables 12033 1726867196.80174: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:19:56 -0400 (0:00:00.240) 0:00:35.917 ****** 12033 1726867196.80213: entering _queue_task() for managed_node3/service 12033 1726867196.81367: worker is 1 (out of 1 available) 12033 1726867196.81382: exiting _queue_task() for managed_node3/service 12033 1726867196.81396: done queuing things up, now waiting for results queue to drain 12033 1726867196.81398: waiting for pending results... 
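The "Enable network service" task queued here is provider-gated: the log records `Evaluated conditional (network_provider == "initscripts"): False`, so it is skipped on this host (where `network_provider` is `nm`). A minimal sketch of such a gated task, assuming the conditional string from the log; the service name is an illustrative assumption, not taken from the role source:

```yaml
# Sketch of a provider-gated service task. The when-expression is
# verbatim from the "Evaluated conditional" log line; the unit name
# "network" is an assumption for illustration.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
```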
12033 1726867196.82197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
12033 1726867196.82330: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a1
12033 1726867196.82345: variable 'ansible_search_path' from source: unknown
12033 1726867196.82349: variable 'ansible_search_path' from source: unknown
12033 1726867196.82384: calling self._execute()
12033 1726867196.82478: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867196.82688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867196.82702: variable 'omit' from source: magic vars
12033 1726867196.83271: variable 'ansible_distribution_major_version' from source: facts
12033 1726867196.83487: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867196.83606: variable 'network_provider' from source: set_fact
12033 1726867196.83609: Evaluated conditional (network_provider == "initscripts"): False
12033 1726867196.83612: when evaluation is False, skipping this task
12033 1726867196.83615: _execute() done
12033 1726867196.83618: dumping result to json
12033 1726867196.83622: done dumping result, returning
12033 1726867196.83629: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-74bb-502b-0000000006a1]
12033 1726867196.83634: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a1
12033 1726867196.83735: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a1
12033 1726867196.83738: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12033 1726867196.83784: no more pending results, returning what we have
12033 1726867196.83788: results queue empty
12033 1726867196.83789: checking for any_errors_fatal
12033 1726867196.83796: done checking for any_errors_fatal
12033 1726867196.83798: checking for max_fail_percentage
12033 1726867196.83800: done checking for max_fail_percentage
12033 1726867196.83801: checking to see if all hosts have failed and the running result is not ok
12033 1726867196.83802: done checking to see if all hosts have failed
12033 1726867196.83803: getting the remaining hosts for this loop
12033 1726867196.83805: done getting the remaining hosts for this loop
12033 1726867196.83808: getting the next task for host managed_node3
12033 1726867196.83817: done getting next task for host managed_node3
12033 1726867196.83820: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12033 1726867196.83826: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867196.83844: getting variables
12033 1726867196.83846: in VariableManager get_vars()
12033 1726867196.83886: Calling all_inventory to load vars for managed_node3
12033 1726867196.83889: Calling groups_inventory to load vars for managed_node3
12033 1726867196.83891: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867196.83903: Calling all_plugins_play to load vars for managed_node3
12033 1726867196.83905: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867196.83908: Calling groups_plugins_play to load vars for managed_node3
12033 1726867196.87737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867196.91034: done with get_vars()
12033 1726867196.91059: done getting variables
12033 1726867196.91419: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 17:19:56 -0400 (0:00:00.112) 0:00:36.030 ******
12033 1726867196.91452: entering _queue_task() for managed_node3/copy
12033 1726867196.92408: worker is 1 (out of 1 available)
12033 1726867196.92417: exiting _queue_task() for managed_node3/copy
12033 1726867196.92426: done queuing things up, now waiting for results queue to drain
12033 1726867196.92428: waiting for pending results...
12033 1726867196.92681: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12033 1726867196.93026: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a2
12033 1726867196.93310: variable 'ansible_search_path' from source: unknown
12033 1726867196.93315: variable 'ansible_search_path' from source: unknown
12033 1726867196.93318: calling self._execute()
12033 1726867196.93321: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867196.93323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867196.93325: variable 'omit' from source: magic vars
12033 1726867196.93944: variable 'ansible_distribution_major_version' from source: facts
12033 1726867196.93958: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867196.94274: variable 'network_provider' from source: set_fact
12033 1726867196.94280: Evaluated conditional (network_provider == "initscripts"): False
12033 1726867196.94291: when evaluation is False, skipping this task
12033 1726867196.94294: _execute() done
12033 1726867196.94297: dumping result to json
12033 1726867196.94300: done dumping result, returning
12033 1726867196.94304: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-74bb-502b-0000000006a2]
12033 1726867196.94400: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a2
12033 1726867196.94476: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a2
12033 1726867196.94482: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12033 1726867196.94551: no more pending results, returning what we have
12033 1726867196.94555: results queue empty
12033 1726867196.94556: checking for any_errors_fatal
12033 1726867196.94562: done checking for any_errors_fatal
12033 1726867196.94563: checking for max_fail_percentage
12033 1726867196.94566: done checking for max_fail_percentage
12033 1726867196.94567: checking to see if all hosts have failed and the running result is not ok
12033 1726867196.94567: done checking to see if all hosts have failed
12033 1726867196.94568: getting the remaining hosts for this loop
12033 1726867196.94570: done getting the remaining hosts for this loop
12033 1726867196.94574: getting the next task for host managed_node3
12033 1726867196.94583: done getting next task for host managed_node3
12033 1726867196.94587: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12033 1726867196.94592: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867196.94609: getting variables
12033 1726867196.94611: in VariableManager get_vars()
12033 1726867196.94646: Calling all_inventory to load vars for managed_node3
12033 1726867196.94649: Calling groups_inventory to load vars for managed_node3
12033 1726867196.94651: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867196.94662: Calling all_plugins_play to load vars for managed_node3
12033 1726867196.94664: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867196.94667: Calling groups_plugins_play to load vars for managed_node3
12033 1726867196.98852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867197.02840: done with get_vars()
12033 1726867197.02868: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 17:19:57 -0400 (0:00:00.116) 0:00:36.146 ******
12033 1726867197.03069: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
12033 1726867197.04091: worker is 1 (out of 1 available)
12033 1726867197.04105: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
12033 1726867197.04118: done queuing things up, now waiting for results queue to drain
12033 1726867197.04120: waiting for pending results...
12033 1726867197.04698: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12033 1726867197.04994: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a3
12033 1726867197.05013: variable 'ansible_search_path' from source: unknown
12033 1726867197.05016: variable 'ansible_search_path' from source: unknown
12033 1726867197.05050: calling self._execute()
12033 1726867197.05230: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867197.05234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867197.05237: variable 'omit' from source: magic vars
12033 1726867197.05924: variable 'ansible_distribution_major_version' from source: facts
12033 1726867197.05936: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867197.05944: variable 'omit' from source: magic vars
12033 1726867197.06285: variable 'omit' from source: magic vars
12033 1726867197.06373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867197.10571: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867197.10628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867197.10665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867197.11007: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867197.11035: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867197.11116: variable 'network_provider' from source: set_fact
12033 1726867197.11393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867197.11423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867197.11448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867197.11592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867197.11612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867197.11689: variable 'omit' from source: magic vars
12033 1726867197.12070: variable 'omit' from source: magic vars
12033 1726867197.12093: variable 'network_connections' from source: task vars
12033 1726867197.12108: variable 'port2_profile' from source: play vars
12033 1726867197.12165: variable 'port2_profile' from source: play vars
12033 1726867197.12175: variable 'port1_profile' from source: play vars
12033 1726867197.12440: variable 'port1_profile' from source: play vars
12033 1726867197.12451: variable 'controller_profile' from source: play vars
12033 1726867197.12511: variable 'controller_profile' from source: play vars
12033 1726867197.12874: variable 'omit' from source: magic vars
12033 1726867197.12884: variable '__lsr_ansible_managed' from source: task vars
12033 1726867197.12942: variable '__lsr_ansible_managed' from source: task vars
12033 1726867197.13333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
12033 1726867197.13885: Loaded config def from plugin (lookup/template)
12033 1726867197.13888: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
12033 1726867197.13891: File lookup term: get_ansible_managed.j2
12033 1726867197.13985: variable 'ansible_search_path' from source: unknown
12033 1726867197.13993: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
12033 1726867197.14104: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
12033 1726867197.14108: variable 'ansible_search_path' from source: unknown
12033 1726867197.26613: variable 'ansible_managed' from source: unknown
12033 1726867197.26733: variable 'omit' from source: magic vars
12033 1726867197.26755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12033 1726867197.26782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12033 1726867197.27007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12033 1726867197.27024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867197.27034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867197.27064: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12033 1726867197.27067: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867197.27069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867197.27279: Set connection var ansible_pipelining to False
12033 1726867197.27283: Set connection var ansible_shell_executable to /bin/sh
12033 1726867197.27492: Set connection var ansible_timeout to 10
12033 1726867197.27496: Set connection var ansible_module_compression to ZIP_DEFLATED
12033 1726867197.27498: Set connection var ansible_connection to ssh
12033 1726867197.27500: Set connection var ansible_shell_type to sh
12033 1726867197.27502: variable 'ansible_shell_executable' from source: unknown
12033 1726867197.27506: variable 'ansible_connection' from source: unknown
12033 1726867197.27508: variable 'ansible_module_compression' from source: unknown
12033 1726867197.27510: variable 'ansible_shell_type' from source: unknown
12033 1726867197.27512: variable 'ansible_shell_executable' from source: unknown
12033 1726867197.27514: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867197.27515: variable 'ansible_pipelining' from source: unknown
12033 1726867197.27517: variable 'ansible_timeout' from source: unknown
12033 1726867197.27527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867197.27579: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
12033 1726867197.27793: variable 'omit' from source: magic vars
12033 1726867197.27802: starting attempt loop
12033 1726867197.27806: running the handler
12033 1726867197.27817: _low_level_execute_command(): starting
12033 1726867197.27825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12033 1726867197.29187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867197.29298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867197.29376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867197.31127: stdout chunk (state=3): >>>/root <<<
12033 1726867197.31345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867197.31348: stdout chunk (state=3): >>><<<
12033 1726867197.31355: stderr chunk (state=3): >>><<<
12033 1726867197.31375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867197.31672: _low_level_execute_command(): starting
12033 1726867197.31678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596 `" && echo ansible-tmp-1726867197.3137467-13859-119127317170596="` echo /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596 `" ) && sleep 0'
12033 1726867197.32587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867197.32882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867197.32907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867197.32917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867197.33051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867197.35184: stdout chunk (state=3): >>>ansible-tmp-1726867197.3137467-13859-119127317170596=/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596 <<<
12033 1726867197.35189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867197.35192: stderr chunk (state=3): >>><<<
12033 1726867197.35194: stdout chunk (state=3): >>><<<
12033 1726867197.35196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867197.3137467-13859-119127317170596=/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867197.35198: variable 'ansible_module_compression' from source: unknown
12033 1726867197.35312: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
12033 1726867197.35381: variable 'ansible_facts' from source: unknown
12033 1726867197.35614: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py
12033 1726867197.36601: Sending initial data
12033 1726867197.36604: Sent initial data (168 bytes)
12033 1726867197.36892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867197.36983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867197.38600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
12033 1726867197.38659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
12033 1726867197.38702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkzqzif26 /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py <<<
12033 1726867197.38706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py" <<<
12033 1726867197.38765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkzqzif26" to remote "/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py" <<<
12033 1726867197.41132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867197.41136: stdout chunk (state=3): >>><<<
12033 1726867197.41142: stderr chunk (state=3): >>><<<
12033 1726867197.41175: done transferring module to remote
12033 1726867197.41189: _low_level_execute_command(): starting
12033 1726867197.41192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/ /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py && sleep 0'
12033 1726867197.42358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867197.42361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867197.42482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<<
12033 1726867197.42486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867197.42579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867197.42614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867197.42631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867197.42769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867197.44768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867197.44986: stdout chunk (state=3): >>><<<
12033 1726867197.44990: stderr chunk (state=3): >>><<<
12033 1726867197.44993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867197.44995: _low_level_execute_command(): starting
12033 1726867197.44997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/AnsiballZ_network_connections.py && sleep 0'
12033 1726867197.46055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867197.46094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867197.46109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867197.46123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867197.46136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<<
12033 1726867197.46143: stderr chunk (state=3): >>>debug2: match not found <<<
12033 1726867197.46153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867197.46168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12033 1726867197.46369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867197.46381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867197.46580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867197.46586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867198.05027: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5369e4ef-1a37-4dbb-886a-05f2f96cb3c2: error=unknown <<<
12033 1726867198.06914: stdout chunk (state=3): >>>Traceback (most recent call last):<<<
12033 1726867198.06920: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back<<<
12033 1726867198.06924: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<<
12033 1726867198.06928: stdout chunk (state=3): >>> ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/014cff9e-2499-4f69-88d3-e2ba3869747a: error=unknown <<<
12033 1726867198.08729: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/de3a6889-a8bc-4195-81cc-ec6220008b92: error=unknown <<<
12033 1726867198.08768: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
12033 1726867198.10635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<<
12033 1726867198.10639: stdout chunk (state=3): >>><<<
12033 1726867198.10641: stderr chunk (state=3): >>><<<
12033 1726867198.10660: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5369e4ef-1a37-4dbb-886a-05f2f96cb3c2: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/014cff9e-2499-4f69-88d3-e2ba3869747a: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_9a1q1d18/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/de3a6889-a8bc-4195-81cc-ec6220008b92: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867198.10710: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867198.10719: _low_level_execute_command(): starting 12033 1726867198.10725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867197.3137467-13859-119127317170596/ > /dev/null 2>&1 && sleep 0' 12033 1726867198.12218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867198.12221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867198.12224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.12226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.12228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.12230: stderr chunk (state=3): >>>debug2: match 
not found <<< 12033 1726867198.12232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.12234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867198.12392: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867198.12407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.12474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.14382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867198.14386: stdout chunk (state=3): >>><<< 12033 1726867198.14394: stderr chunk (state=3): >>><<< 12033 1726867198.14412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867198.14419: handler run complete 12033 1726867198.14454: attempt loop complete, returning result 12033 1726867198.14457: _execute() done 12033 1726867198.14460: dumping result to json 12033 1726867198.14462: done dumping result, returning 12033 1726867198.14470: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-74bb-502b-0000000006a3] 12033 1726867198.14475: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a3 12033 1726867198.14602: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a3 12033 1726867198.14605: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12033 1726867198.14715: no more pending results, returning what we have 12033 1726867198.14718: results queue empty 12033 1726867198.14719: checking for any_errors_fatal 12033 1726867198.14725: done checking for any_errors_fatal 12033 1726867198.14726: checking for max_fail_percentage 12033 
1726867198.14728: done checking for max_fail_percentage 12033 1726867198.14729: checking to see if all hosts have failed and the running result is not ok 12033 1726867198.14730: done checking to see if all hosts have failed 12033 1726867198.14731: getting the remaining hosts for this loop 12033 1726867198.14732: done getting the remaining hosts for this loop 12033 1726867198.14735: getting the next task for host managed_node3 12033 1726867198.14744: done getting next task for host managed_node3 12033 1726867198.14747: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867198.14752: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867198.14763: getting variables 12033 1726867198.14764: in VariableManager get_vars() 12033 1726867198.14915: Calling all_inventory to load vars for managed_node3 12033 1726867198.14917: Calling groups_inventory to load vars for managed_node3 12033 1726867198.14920: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.14929: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.14932: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.14935: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.17236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867198.20309: done with get_vars() 12033 1726867198.20335: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:19:58 -0400 (0:00:01.173) 0:00:37.320 ****** 12033 1726867198.20453: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867198.20848: worker is 1 (out of 1 available) 12033 1726867198.20861: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867198.20885: done queuing things up, now waiting for results queue to drain 12033 1726867198.20887: waiting for pending results... 
12033 1726867198.21322: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867198.21462: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a4 12033 1726867198.21489: variable 'ansible_search_path' from source: unknown 12033 1726867198.21502: variable 'ansible_search_path' from source: unknown 12033 1726867198.21550: calling self._execute() 12033 1726867198.21659: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.21672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.21691: variable 'omit' from source: magic vars 12033 1726867198.22136: variable 'ansible_distribution_major_version' from source: facts 12033 1726867198.22154: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867198.22294: variable 'network_state' from source: role '' defaults 12033 1726867198.22316: Evaluated conditional (network_state != {}): False 12033 1726867198.22324: when evaluation is False, skipping this task 12033 1726867198.22331: _execute() done 12033 1726867198.22354: dumping result to json 12033 1726867198.22356: done dumping result, returning 12033 1726867198.22359: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-74bb-502b-0000000006a4] 12033 1726867198.22365: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a4 12033 1726867198.22595: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a4 12033 1726867198.22601: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867198.22662: no more pending results, returning what we have 12033 1726867198.22667: results queue empty 12033 1726867198.22668: checking for any_errors_fatal 12033 1726867198.22790: done checking for any_errors_fatal 
12033 1726867198.22794: checking for max_fail_percentage 12033 1726867198.22797: done checking for max_fail_percentage 12033 1726867198.22798: checking to see if all hosts have failed and the running result is not ok 12033 1726867198.22799: done checking to see if all hosts have failed 12033 1726867198.22799: getting the remaining hosts for this loop 12033 1726867198.22801: done getting the remaining hosts for this loop 12033 1726867198.22804: getting the next task for host managed_node3 12033 1726867198.22812: done getting next task for host managed_node3 12033 1726867198.22815: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867198.22821: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867198.22841: getting variables 12033 1726867198.22843: in VariableManager get_vars() 12033 1726867198.23010: Calling all_inventory to load vars for managed_node3 12033 1726867198.23014: Calling groups_inventory to load vars for managed_node3 12033 1726867198.23016: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.23026: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.23029: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.23031: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.24444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867198.26776: done with get_vars() 12033 1726867198.26840: done getting variables 12033 1726867198.26907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:19:58 -0400 (0:00:00.064) 0:00:37.385 ****** 12033 1726867198.26946: entering _queue_task() for managed_node3/debug 12033 1726867198.27722: worker is 1 (out of 1 available) 12033 1726867198.27735: exiting _queue_task() for managed_node3/debug 12033 1726867198.27755: done queuing things up, now waiting for results queue to drain 12033 1726867198.27757: waiting for pending results... 
12033 1726867198.28217: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867198.28626: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a5 12033 1726867198.28639: variable 'ansible_search_path' from source: unknown 12033 1726867198.28643: variable 'ansible_search_path' from source: unknown 12033 1726867198.28681: calling self._execute() 12033 1726867198.28766: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.28770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.28781: variable 'omit' from source: magic vars 12033 1726867198.29728: variable 'ansible_distribution_major_version' from source: facts 12033 1726867198.29740: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867198.29749: variable 'omit' from source: magic vars 12033 1726867198.30132: variable 'omit' from source: magic vars 12033 1726867198.30171: variable 'omit' from source: magic vars 12033 1726867198.30205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867198.30238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867198.30260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867198.30499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.30502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.30582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867198.30586: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.30588: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867198.30629: Set connection var ansible_pipelining to False 12033 1726867198.30638: Set connection var ansible_shell_executable to /bin/sh 12033 1726867198.30645: Set connection var ansible_timeout to 10 12033 1726867198.30652: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867198.30654: Set connection var ansible_connection to ssh 12033 1726867198.30660: Set connection var ansible_shell_type to sh 12033 1726867198.30886: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.30889: variable 'ansible_connection' from source: unknown 12033 1726867198.30895: variable 'ansible_module_compression' from source: unknown 12033 1726867198.30898: variable 'ansible_shell_type' from source: unknown 12033 1726867198.30900: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.30902: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.30905: variable 'ansible_pipelining' from source: unknown 12033 1726867198.30907: variable 'ansible_timeout' from source: unknown 12033 1726867198.30937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.31284: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867198.31288: variable 'omit' from source: magic vars 12033 1726867198.31290: starting attempt loop 12033 1726867198.31296: running the handler 12033 1726867198.31298: variable '__network_connections_result' from source: set_fact 12033 1726867198.31358: handler run complete 12033 1726867198.31374: attempt loop complete, returning result 12033 1726867198.31378: _execute() done 12033 1726867198.31381: dumping result to json 12033 1726867198.31809: 
done dumping result, returning 12033 1726867198.31859: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-74bb-502b-0000000006a5] 12033 1726867198.31862: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a5 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 12033 1726867198.32112: no more pending results, returning what we have 12033 1726867198.32116: results queue empty 12033 1726867198.32118: checking for any_errors_fatal 12033 1726867198.32128: done checking for any_errors_fatal 12033 1726867198.32129: checking for max_fail_percentage 12033 1726867198.32131: done checking for max_fail_percentage 12033 1726867198.32132: checking to see if all hosts have failed and the running result is not ok 12033 1726867198.32133: done checking to see if all hosts have failed 12033 1726867198.32134: getting the remaining hosts for this loop 12033 1726867198.32135: done getting the remaining hosts for this loop 12033 1726867198.32139: getting the next task for host managed_node3 12033 1726867198.32147: done getting next task for host managed_node3 12033 1726867198.32151: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867198.32157: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867198.32169: getting variables 12033 1726867198.32171: in VariableManager get_vars() 12033 1726867198.32321: Calling all_inventory to load vars for managed_node3 12033 1726867198.32324: Calling groups_inventory to load vars for managed_node3 12033 1726867198.32326: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.32415: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.32423: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.32466: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.33295: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a5 12033 1726867198.33299: WORKER PROCESS EXITING 12033 1726867198.37152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867198.41232: done with get_vars() 12033 1726867198.41285: done getting variables 12033 1726867198.41347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:19:58 -0400 (0:00:00.146) 0:00:37.531 ****** 12033 1726867198.41608: entering _queue_task() for managed_node3/debug 12033 1726867198.42425: worker is 1 (out of 1 available) 12033 1726867198.42562: exiting _queue_task() for managed_node3/debug 12033 1726867198.42579: done queuing things up, now waiting for results queue to drain 12033 1726867198.42581: waiting for pending results... 12033 1726867198.43269: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867198.43844: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a6 12033 1726867198.43980: variable 'ansible_search_path' from source: unknown 12033 1726867198.43986: variable 'ansible_search_path' from source: unknown 12033 1726867198.43990: calling self._execute() 12033 1726867198.44412: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.44417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.44428: variable 'omit' from source: magic vars 12033 1726867198.46068: variable 'ansible_distribution_major_version' from source: facts 12033 1726867198.46071: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867198.46074: variable 'omit' from source: magic vars 12033 1726867198.46079: variable 'omit' from source: magic vars 12033 1726867198.46500: variable 'omit' from source: magic vars 12033 1726867198.46540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867198.46581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867198.46623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867198.46643: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.46655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.47052: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867198.47056: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.47059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.47202: Set connection var ansible_pipelining to False 12033 1726867198.47211: Set connection var ansible_shell_executable to /bin/sh 12033 1726867198.47219: Set connection var ansible_timeout to 10 12033 1726867198.47225: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867198.47228: Set connection var ansible_connection to ssh 12033 1726867198.47233: Set connection var ansible_shell_type to sh 12033 1726867198.47258: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.47262: variable 'ansible_connection' from source: unknown 12033 1726867198.47267: variable 'ansible_module_compression' from source: unknown 12033 1726867198.47269: variable 'ansible_shell_type' from source: unknown 12033 1726867198.47271: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.47284: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.47287: variable 'ansible_pipelining' from source: unknown 12033 1726867198.47289: variable 'ansible_timeout' from source: unknown 12033 1726867198.47387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.47548: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867198.47584: variable 'omit' from source: magic vars 12033 1726867198.47593: starting attempt loop 12033 1726867198.47599: running the handler 12033 1726867198.47934: variable '__network_connections_result' from source: set_fact 12033 1726867198.47997: variable '__network_connections_result' from source: set_fact 12033 1726867198.48342: handler run complete 12033 1726867198.48346: attempt loop complete, returning result 12033 1726867198.48349: _execute() done 12033 1726867198.48351: dumping result to json 12033 1726867198.48358: done dumping result, returning 12033 1726867198.48379: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-74bb-502b-0000000006a6] 12033 1726867198.48383: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a6 12033 1726867198.48456: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a6 12033 1726867198.48458: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12033 1726867198.48675: no more pending results, returning what we have 12033 1726867198.48681: results queue empty 12033 1726867198.48683: checking for any_errors_fatal 12033 1726867198.48693: done checking for any_errors_fatal 12033 1726867198.48694: 
checking for max_fail_percentage 12033 1726867198.48696: done checking for max_fail_percentage 12033 1726867198.48696: checking to see if all hosts have failed and the running result is not ok 12033 1726867198.48697: done checking to see if all hosts have failed 12033 1726867198.48698: getting the remaining hosts for this loop 12033 1726867198.48700: done getting the remaining hosts for this loop 12033 1726867198.48703: getting the next task for host managed_node3 12033 1726867198.48710: done getting next task for host managed_node3 12033 1726867198.48713: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867198.48717: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867198.48728: getting variables 12033 1726867198.48729: in VariableManager get_vars() 12033 1726867198.48763: Calling all_inventory to load vars for managed_node3 12033 1726867198.48771: Calling groups_inventory to load vars for managed_node3 12033 1726867198.48774: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.49004: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.49007: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.49011: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.52026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867198.54327: done with get_vars() 12033 1726867198.54349: done getting variables 12033 1726867198.54407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:19:58 -0400 (0:00:00.128) 0:00:37.660 ****** 12033 1726867198.54443: entering _queue_task() for managed_node3/debug 12033 1726867198.54752: worker is 1 (out of 1 available) 12033 1726867198.54764: exiting _queue_task() for managed_node3/debug 12033 1726867198.54776: done queuing things up, now waiting for results queue to drain 12033 1726867198.54780: waiting for pending results... 
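The task result logged above shows the `connections` payload the role passed to the `nm` provider: each bond profile is removed (`"persistent_state": "absent"`) and brought down (`"state": "down"`). As a minimal sketch only (the helper name is hypothetical, not role code), building such a teardown list looks like:

```python
# Hypothetical helper, not part of fedora.linux_system_roles: build the
# teardown entries seen in __network_connections_result's module_args,
# where each named profile is removed and deactivated.
def teardown_connections(names):
    """Return module_args-style entries for removing the named profiles."""
    return [
        {"name": name, "persistent_state": "absent", "state": "down"}
        for name in names
    ]

payload = teardown_connections(["bond0.1", "bond0.0", "bond0"])
```

With the three bond profiles from the log, `payload` matches the `connections` list in the result above.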
12033 1726867198.55213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867198.55219: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a7 12033 1726867198.55362: variable 'ansible_search_path' from source: unknown 12033 1726867198.55366: variable 'ansible_search_path' from source: unknown 12033 1726867198.55369: calling self._execute() 12033 1726867198.55372: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.55376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.55380: variable 'omit' from source: magic vars 12033 1726867198.55963: variable 'ansible_distribution_major_version' from source: facts 12033 1726867198.55968: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867198.56119: variable 'network_state' from source: role '' defaults 12033 1726867198.56122: Evaluated conditional (network_state != {}): False 12033 1726867198.56125: when evaluation is False, skipping this task 12033 1726867198.56127: _execute() done 12033 1726867198.56129: dumping result to json 12033 1726867198.56131: done dumping result, returning 12033 1726867198.56134: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-74bb-502b-0000000006a7] 12033 1726867198.56136: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a7 12033 1726867198.56293: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a7 12033 1726867198.56296: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12033 1726867198.56341: no more pending results, returning what we have 12033 1726867198.56345: results queue empty 12033 1726867198.56346: checking for any_errors_fatal 12033 1726867198.56354: done checking for any_errors_fatal 12033 1726867198.56355: checking for 
max_fail_percentage 12033 1726867198.56357: done checking for max_fail_percentage 12033 1726867198.56358: checking to see if all hosts have failed and the running result is not ok 12033 1726867198.56358: done checking to see if all hosts have failed 12033 1726867198.56359: getting the remaining hosts for this loop 12033 1726867198.56361: done getting the remaining hosts for this loop 12033 1726867198.56365: getting the next task for host managed_node3 12033 1726867198.56372: done getting next task for host managed_node3 12033 1726867198.56375: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867198.56484: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867198.56500: getting variables 12033 1726867198.56502: in VariableManager get_vars() 12033 1726867198.56532: Calling all_inventory to load vars for managed_node3 12033 1726867198.56535: Calling groups_inventory to load vars for managed_node3 12033 1726867198.56537: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.56545: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.56547: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.56550: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.58062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867198.59651: done with get_vars() 12033 1726867198.59671: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:19:58 -0400 (0:00:00.053) 0:00:37.713 ****** 12033 1726867198.59771: entering _queue_task() for managed_node3/ping 12033 1726867198.60171: worker is 1 (out of 1 available) 12033 1726867198.60186: exiting _queue_task() for managed_node3/ping 12033 1726867198.60199: done queuing things up, now waiting for results queue to drain 12033 1726867198.60200: waiting for pending results... 
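The "Re-test connectivity" task that starts here runs the `ping` module: the trace below transfers `AnsiballZ_ping.py` to the remote host and gets back `{"ping": "pong", ...}`. As a rough sketch of that module's contract only (not the actual `ansible.modules.ping` source), the round trip amounts to:

```python
# Rough sketch of the ping module's contract (not the real
# ansible.modules.ping source): echo the "data" argument back and
# report the invocation, as seen in the module's stdout below.
def ping(data="pong"):
    """Echo data back, mirroring the module result structure."""
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

result = ping()
```

The JSON the remote interpreter prints further down in the trace has exactly this shape.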
12033 1726867198.60492: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867198.60688: in run() - task 0affcac9-a3a5-74bb-502b-0000000006a8 12033 1726867198.60693: variable 'ansible_search_path' from source: unknown 12033 1726867198.60696: variable 'ansible_search_path' from source: unknown 12033 1726867198.60699: calling self._execute() 12033 1726867198.60794: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.60798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.60800: variable 'omit' from source: magic vars 12033 1726867198.61224: variable 'ansible_distribution_major_version' from source: facts 12033 1726867198.61228: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867198.61230: variable 'omit' from source: magic vars 12033 1726867198.61332: variable 'omit' from source: magic vars 12033 1726867198.61336: variable 'omit' from source: magic vars 12033 1726867198.61340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867198.61362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867198.61382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867198.61409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.61420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867198.61467: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867198.61470: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.61472: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12033 1726867198.61574: Set connection var ansible_pipelining to False 12033 1726867198.61579: Set connection var ansible_shell_executable to /bin/sh 12033 1726867198.61582: Set connection var ansible_timeout to 10 12033 1726867198.61584: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867198.61597: Set connection var ansible_connection to ssh 12033 1726867198.61599: Set connection var ansible_shell_type to sh 12033 1726867198.61623: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.61626: variable 'ansible_connection' from source: unknown 12033 1726867198.61629: variable 'ansible_module_compression' from source: unknown 12033 1726867198.61631: variable 'ansible_shell_type' from source: unknown 12033 1726867198.61633: variable 'ansible_shell_executable' from source: unknown 12033 1726867198.61635: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867198.61639: variable 'ansible_pipelining' from source: unknown 12033 1726867198.61641: variable 'ansible_timeout' from source: unknown 12033 1726867198.61643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867198.61879: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867198.61884: variable 'omit' from source: magic vars 12033 1726867198.61889: starting attempt loop 12033 1726867198.61892: running the handler 12033 1726867198.62192: _low_level_execute_command(): starting 12033 1726867198.62195: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867198.62781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867198.62944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 
1726867198.62955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.62971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.62992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.63001: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867198.63131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.63190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867198.63300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.63318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.63396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.65099: stdout chunk (state=3): >>>/root <<< 12033 1726867198.65383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867198.65387: stderr chunk (state=3): >>><<< 12033 1726867198.65389: stdout chunk (state=3): >>><<< 12033 1726867198.65392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867198.65395: _low_level_execute_command(): starting 12033 1726867198.65398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885 `" && echo ansible-tmp-1726867198.6527388-13907-177593935496885="` echo /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885 `" ) && sleep 0' 12033 1726867198.65902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867198.65912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867198.65922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.65934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.65945: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.66118: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867198.66122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.66125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867198.66127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867198.66142: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867198.66145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867198.66147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.66150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.66152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.66154: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867198.66156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.66158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867198.66159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.66161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.66358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.68253: stdout chunk (state=3): >>>ansible-tmp-1726867198.6527388-13907-177593935496885=/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885 <<< 12033 1726867198.68419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 
1726867198.68422: stdout chunk (state=3): >>><<< 12033 1726867198.68425: stderr chunk (state=3): >>><<< 12033 1726867198.68597: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867198.6527388-13907-177593935496885=/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867198.68600: variable 'ansible_module_compression' from source: unknown 12033 1726867198.68603: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12033 1726867198.68605: variable 'ansible_facts' from source: unknown 12033 1726867198.68713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py 12033 1726867198.68918: Sending initial data 12033 1726867198.68921: Sent 
initial data (153 bytes) 12033 1726867198.69707: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.69758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867198.69783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.69811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.69961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.71448: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867198.71494: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867198.71522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867198.71575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp52d9fxsh /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py <<< 12033 1726867198.71581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py" <<< 12033 1726867198.71639: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp52d9fxsh" to remote "/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py" <<< 12033 1726867198.72581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867198.72585: stdout chunk (state=3): >>><<< 12033 1726867198.72587: stderr chunk (state=3): >>><<< 12033 1726867198.72589: done transferring module to remote 12033 1726867198.72594: _low_level_execute_command(): starting 12033 1726867198.72597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/ /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py && sleep 0' 12033 1726867198.73356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.73372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.73398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.73483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.75211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867198.75224: stderr chunk (state=3): >>><<< 12033 1726867198.75228: stdout chunk (state=3): >>><<< 12033 1726867198.75241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867198.75244: _low_level_execute_command(): starting 12033 1726867198.75248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/AnsiballZ_ping.py && sleep 0' 12033 1726867198.75641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.75646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.75648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867198.75651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.75653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.75700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.75704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.75755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.90985: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12033 1726867198.92161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867198.92164: stdout chunk (state=3): >>><<< 12033 1726867198.92167: stderr chunk (state=3): >>><<< 12033 1726867198.92169: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867198.92173: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867198.92188: _low_level_execute_command(): starting 12033 1726867198.92194: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867198.6527388-13907-177593935496885/ > /dev/null 2>&1 && sleep 0' 12033 1726867198.93360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867198.93364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867198.93366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.93369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.93371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.93433: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867198.93518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.93522: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 12033 1726867198.93525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867198.93527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867198.93534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867198.93543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867198.93550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867198.93552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867198.93554: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867198.93556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867198.93558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867198.93756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867198.93767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867198.93791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867198.95883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867198.95887: stdout chunk (state=3): >>><<< 12033 1726867198.95890: stderr chunk (state=3): >>><<< 12033 1726867198.95892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867198.95894: handler run complete 12033 1726867198.95897: attempt loop complete, returning result 12033 1726867198.95899: _execute() done 12033 1726867198.95902: dumping result to json 12033 1726867198.95904: done dumping result, returning 12033 1726867198.95907: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-74bb-502b-0000000006a8] 12033 1726867198.95909: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a8 12033 1726867198.95980: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006a8 12033 1726867198.95984: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12033 1726867198.96059: no more pending results, returning what we have 12033 1726867198.96065: results queue empty 12033 1726867198.96066: checking for any_errors_fatal 12033 1726867198.96072: done checking for any_errors_fatal 12033 1726867198.96073: checking for max_fail_percentage 12033 1726867198.96075: done checking for max_fail_percentage 12033 1726867198.96076: checking to see if all hosts have failed and the running result is not ok 12033 
1726867198.96078: done checking to see if all hosts have failed 12033 1726867198.96079: getting the remaining hosts for this loop 12033 1726867198.96083: done getting the remaining hosts for this loop 12033 1726867198.96086: getting the next task for host managed_node3 12033 1726867198.96098: done getting next task for host managed_node3 12033 1726867198.96099: ^ task is: TASK: meta (role_complete) 12033 1726867198.96105: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867198.96115: getting variables 12033 1726867198.96117: in VariableManager get_vars() 12033 1726867198.96153: Calling all_inventory to load vars for managed_node3 12033 1726867198.96156: Calling groups_inventory to load vars for managed_node3 12033 1726867198.96158: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867198.96166: Calling all_plugins_play to load vars for managed_node3 12033 1726867198.96169: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867198.96171: Calling groups_plugins_play to load vars for managed_node3 12033 1726867198.99570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867199.02348: done with get_vars() 12033 1726867199.02371: done getting variables 12033 1726867199.02456: done queuing things up, now waiting for results queue to drain 12033 1726867199.02459: results queue empty 12033 1726867199.02460: checking for any_errors_fatal 12033 1726867199.02462: done checking for any_errors_fatal 12033 1726867199.02463: checking for max_fail_percentage 12033 1726867199.02464: done checking for max_fail_percentage 12033 1726867199.02465: checking to see if all hosts have failed and the running result is not ok 12033 1726867199.02465: done checking to see if all hosts have failed 12033 1726867199.02466: getting the remaining hosts for this loop 12033 1726867199.02467: done getting the remaining hosts for this loop 12033 1726867199.02470: getting the next task for host managed_node3 12033 1726867199.02474: done getting next task for host managed_node3 12033 1726867199.02679: ^ task is: TASK: Delete the device '{{ controller_device }}' 12033 1726867199.02684: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867199.02688: getting variables 12033 1726867199.02689: in VariableManager get_vars() 12033 1726867199.02703: Calling all_inventory to load vars for managed_node3 12033 1726867199.02705: Calling groups_inventory to load vars for managed_node3 12033 1726867199.02708: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867199.02717: Calling all_plugins_play to load vars for managed_node3 12033 1726867199.02719: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867199.02723: Calling groups_plugins_play to load vars for managed_node3 12033 1726867199.05520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867199.08851: done with get_vars() 12033 1726867199.08875: done getting variables 12033 1726867199.08920: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867199.09242: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 17:19:59 -0400 (0:00:00.495) 0:00:38.208 ****** 12033 1726867199.09274: entering _queue_task() for managed_node3/command 12033 1726867199.10516: worker is 1 (out of 1 available) 12033 1726867199.10525: exiting _queue_task() for managed_node3/command 12033 1726867199.10535: done queuing things up, now waiting for results queue to drain 12033 1726867199.10537: waiting for pending results... 12033 1726867199.10899: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 12033 1726867199.11188: in run() - task 0affcac9-a3a5-74bb-502b-0000000006d8 12033 1726867199.11191: variable 'ansible_search_path' from source: unknown 12033 1726867199.11194: variable 'ansible_search_path' from source: unknown 12033 1726867199.11197: calling self._execute() 12033 1726867199.11312: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.11324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.11415: variable 'omit' from source: magic vars 12033 1726867199.12289: variable 'ansible_distribution_major_version' from source: facts 12033 1726867199.12487: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867199.12493: variable 'omit' from source: magic vars 12033 1726867199.12496: variable 'omit' from source: magic vars 12033 1726867199.12921: variable 'controller_device' from source: play vars 12033 1726867199.12925: variable 'omit' from source: magic vars 12033 1726867199.12927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867199.12931: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867199.13106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 
1726867199.13466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867199.13469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867199.13472: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867199.13475: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.13480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.13652: Set connection var ansible_pipelining to False 12033 1726867199.14084: Set connection var ansible_shell_executable to /bin/sh 12033 1726867199.14087: Set connection var ansible_timeout to 10 12033 1726867199.14090: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867199.14096: Set connection var ansible_connection to ssh 12033 1726867199.14098: Set connection var ansible_shell_type to sh 12033 1726867199.14100: variable 'ansible_shell_executable' from source: unknown 12033 1726867199.14103: variable 'ansible_connection' from source: unknown 12033 1726867199.14105: variable 'ansible_module_compression' from source: unknown 12033 1726867199.14107: variable 'ansible_shell_type' from source: unknown 12033 1726867199.14110: variable 'ansible_shell_executable' from source: unknown 12033 1726867199.14112: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.14114: variable 'ansible_pipelining' from source: unknown 12033 1726867199.14115: variable 'ansible_timeout' from source: unknown 12033 1726867199.14117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.14412: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867199.14886: variable 'omit' from source: magic vars 12033 1726867199.14890: starting attempt loop 12033 1726867199.14895: running the handler 12033 1726867199.14898: _low_level_execute_command(): starting 12033 1726867199.14900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867199.16130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.16147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867199.16280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867199.16301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867199.16316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.16418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.16845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.16848: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.18462: stdout chunk (state=3): >>>/root <<< 12033 1726867199.18589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.18597: stdout chunk (state=3): >>><<< 12033 1726867199.18604: stderr chunk (state=3): >>><<< 12033 1726867199.18623: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.18694: _low_level_execute_command(): starting 12033 1726867199.18698: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327 `" && echo ansible-tmp-1726867199.1862326-13929-281242176100327="` echo 
/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327 `" ) && sleep 0' 12033 1726867199.20186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.20189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.20194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867199.20299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.20500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.20571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.22495: stdout chunk (state=3): >>>ansible-tmp-1726867199.1862326-13929-281242176100327=/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327 <<< 12033 1726867199.22689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.22692: stdout chunk (state=3): >>><<< 12033 1726867199.22695: stderr chunk (state=3): >>><<< 12033 
1726867199.22697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867199.1862326-13929-281242176100327=/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.22709: variable 'ansible_module_compression' from source: unknown 12033 1726867199.22761: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867199.22799: variable 'ansible_facts' from source: unknown 12033 1726867199.22894: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py 12033 1726867199.23030: Sending initial data 12033 1726867199.23042: Sent initial data (156 bytes) 12033 1726867199.23447: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.23450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.23453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867199.23456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867199.23457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.23512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.23515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.23556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.25114: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867199.25166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867199.25240: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpwdsz16nl /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py <<< 12033 1726867199.25244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py" <<< 12033 1726867199.25289: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpwdsz16nl" to remote "/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py" <<< 12033 1726867199.26193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.26197: stderr chunk (state=3): >>><<< 12033 1726867199.26199: stdout chunk (state=3): >>><<< 12033 1726867199.26202: done transferring module to remote 12033 1726867199.26204: _low_level_execute_command(): starting 12033 1726867199.26207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/ /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py && sleep 0' 12033 1726867199.26783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.26799: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12033 1726867199.26888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.26925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.26941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.26955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.27025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.28898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.28901: stderr chunk (state=3): >>><<< 12033 1726867199.28904: stdout chunk (state=3): >>><<< 12033 1726867199.28915: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.28918: _low_level_execute_command(): starting 12033 1726867199.28922: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/AnsiballZ_command.py && sleep 0' 12033 1726867199.29371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.29409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867199.29419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.29422: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.29474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.29505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.29559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.45688: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:19:59.447200", "end": "2024-09-20 17:19:59.454429", "delta": "0:00:00.007229", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867199.47115: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867199.47147: stderr chunk (state=3): >>><<< 12033 1726867199.47150: stdout chunk (state=3): >>><<< 12033 1726867199.47169: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:19:59.447200", "end": "2024-09-20 17:19:59.454429", "delta": "0:00:00.007229", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 
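The command record above fails with rc=1 only because the device is already absent (stderr `Cannot find device "nm-bond"`); the play later treats that as success (`failed_when_result: false`). A minimal shell sketch of the same tolerance outside Ansible — the function names `is_benign_del_error` and `safe_link_del` are hypothetical, and `safe_link_del` assumes `ip` is in PATH and the caller has CAP_NET_ADMIN:

```shell
# Decide whether an `ip link del` failure is benign: the log shows rc=1 with
# stderr 'Cannot find device "nm-bond"', meaning the link is already gone.
is_benign_del_error() {
    case "$1" in
        *"Cannot find device"*) return 0 ;;  # already absent: treat as success
        *) return 1 ;;                       # any other error is real
    esac
}

# Tolerant wrapper (assumption: `ip` available, sufficient privileges).
safe_link_del() {
    err=$(ip link del "$1" 2>&1) && return 0
    is_benign_del_error "$err"
}
```

This mirrors what the task's `failed_when` condition achieves declaratively: a missing device is an acceptable outcome of a cleanup step, while other failures still propagate.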
12033 1726867199.47264: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867199.47269: _low_level_execute_command(): starting 12033 1726867199.47272: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867199.1862326-13929-281242176100327/ > /dev/null 2>&1 && sleep 0' 12033 1726867199.47912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.48055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.48064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.48108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.49997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.50025: stderr chunk (state=3): >>><<< 12033 1726867199.50029: stdout chunk (state=3): >>><<< 12033 1726867199.50062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.50066: handler run complete 12033 1726867199.50171: Evaluated conditional (False): False 12033 1726867199.50175: Evaluated 
conditional (False): False 12033 1726867199.50180: attempt loop complete, returning result 12033 1726867199.50183: _execute() done 12033 1726867199.50185: dumping result to json 12033 1726867199.50187: done dumping result, returning 12033 1726867199.50189: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0affcac9-a3a5-74bb-502b-0000000006d8] 12033 1726867199.50193: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006d8 12033 1726867199.50404: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006d8 12033 1726867199.50409: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007229", "end": "2024-09-20 17:19:59.454429", "failed_when_result": false, "rc": 1, "start": "2024-09-20 17:19:59.447200" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 12033 1726867199.50496: no more pending results, returning what we have 12033 1726867199.50499: results queue empty 12033 1726867199.50500: checking for any_errors_fatal 12033 1726867199.50502: done checking for any_errors_fatal 12033 1726867199.50502: checking for max_fail_percentage 12033 1726867199.50504: done checking for max_fail_percentage 12033 1726867199.50505: checking to see if all hosts have failed and the running result is not ok 12033 1726867199.50506: done checking to see if all hosts have failed 12033 1726867199.50509: getting the remaining hosts for this loop 12033 1726867199.50512: done getting the remaining hosts for this loop 12033 1726867199.50515: getting the next task for host managed_node3 12033 1726867199.50527: done getting next task for host managed_node3 12033 1726867199.50530: ^ task is: TASK: Remove test interfaces 12033 1726867199.50534: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867199.50539: getting variables 12033 1726867199.50540: in VariableManager get_vars() 12033 1726867199.50792: Calling all_inventory to load vars for managed_node3 12033 1726867199.50795: Calling groups_inventory to load vars for managed_node3 12033 1726867199.50798: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867199.50814: Calling all_plugins_play to load vars for managed_node3 12033 1726867199.50818: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867199.50822: Calling groups_plugins_play to load vars for managed_node3 12033 1726867199.52894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867199.54156: done with get_vars() 12033 1726867199.54176: done getting variables 12033 1726867199.54220: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:19:59 -0400 (0:00:00.449) 0:00:38.658 ****** 12033 1726867199.54242: entering _queue_task() for managed_node3/shell 12033 1726867199.54486: worker is 1 (out of 1 available) 12033 1726867199.54498: exiting _queue_task() for managed_node3/shell 12033 1726867199.54510: done queuing things up, now waiting for results queue to drain 12033 1726867199.54511: waiting for pending results... 12033 1726867199.54690: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 12033 1726867199.54762: in run() - task 0affcac9-a3a5-74bb-502b-0000000006de 12033 1726867199.54773: variable 'ansible_search_path' from source: unknown 12033 1726867199.54778: variable 'ansible_search_path' from source: unknown 12033 1726867199.54808: calling self._execute() 12033 1726867199.54883: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.54887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.54899: variable 'omit' from source: magic vars 12033 1726867199.55209: variable 'ansible_distribution_major_version' from source: facts 12033 1726867199.55213: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867199.55220: variable 'omit' from source: magic vars 12033 1726867199.55268: variable 'omit' from source: magic vars 12033 1726867199.55583: variable 'dhcp_interface1' from source: play vars 12033 1726867199.55587: variable 'dhcp_interface2' from source: play vars 12033 1726867199.55589: variable 'omit' from source: magic vars 12033 1726867199.55986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867199.55991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867199.55993: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867199.55996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867199.55998: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867199.56001: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867199.56003: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.56005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.56008: Set connection var ansible_pipelining to False 12033 1726867199.56010: Set connection var ansible_shell_executable to /bin/sh 12033 1726867199.56012: Set connection var ansible_timeout to 10 12033 1726867199.56014: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867199.56015: Set connection var ansible_connection to ssh 12033 1726867199.56017: Set connection var ansible_shell_type to sh 12033 1726867199.56020: variable 'ansible_shell_executable' from source: unknown 12033 1726867199.56021: variable 'ansible_connection' from source: unknown 12033 1726867199.56023: variable 'ansible_module_compression' from source: unknown 12033 1726867199.56025: variable 'ansible_shell_type' from source: unknown 12033 1726867199.56027: variable 'ansible_shell_executable' from source: unknown 12033 1726867199.56029: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867199.56031: variable 'ansible_pipelining' from source: unknown 12033 1726867199.56033: variable 'ansible_timeout' from source: unknown 12033 1726867199.56035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867199.56216: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867199.56227: variable 'omit' from source: magic vars 12033 1726867199.56236: starting attempt loop 12033 1726867199.56239: running the handler 12033 1726867199.56242: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867199.56265: _low_level_execute_command(): starting 12033 1726867199.56268: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867199.57043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.57059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.57070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.57087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867199.57103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867199.57110: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867199.57120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.57133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867199.57199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.57227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.57241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.57259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.57330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.59183: stdout chunk (state=3): >>>/root <<< 12033 1726867199.59259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.59321: stderr chunk (state=3): >>><<< 12033 1726867199.59325: stdout chunk (state=3): >>><<< 12033 1726867199.59348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.59360: _low_level_execute_command(): starting 12033 1726867199.59367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964 `" && echo ansible-tmp-1726867199.5934682-13957-121692167351964="` echo /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964 `" ) && sleep 0' 12033 1726867199.60328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.60338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.60371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.60432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.60503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.60548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.60551: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.60604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.62494: stdout chunk (state=3): >>>ansible-tmp-1726867199.5934682-13957-121692167351964=/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964 <<< 12033 1726867199.62596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.62648: stderr chunk (state=3): >>><<< 12033 1726867199.62687: stdout chunk (state=3): >>><<< 12033 1726867199.62885: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867199.5934682-13957-121692167351964=/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.62888: variable 
'ansible_module_compression' from source: unknown 12033 1726867199.62890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867199.62893: variable 'ansible_facts' from source: unknown 12033 1726867199.63300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py 12033 1726867199.64030: Sending initial data 12033 1726867199.64035: Sent initial data (156 bytes) 12033 1726867199.65492: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.65555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.65559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.65617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.65776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.67349: stderr chunk (state=3): >>>debug2: Remote version: 3 
<<< 12033 1726867199.67356: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12033 1726867199.67364: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12033 1726867199.67371: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12033 1726867199.67398: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867199.67456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867199.67504: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkysys2zv /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py <<< 12033 1726867199.67507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py" <<< 12033 1726867199.67616: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkysys2zv" to remote "/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py" <<< 12033 1726867199.68955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.68957: stdout chunk (state=3): >>><<< 12033 1726867199.68959: stderr chunk (state=3): >>><<< 12033 1726867199.68994: done transferring module to remote 12033 1726867199.69010: _low_level_execute_command(): starting 12033 1726867199.69013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/ /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py && sleep 0' 12033 1726867199.69823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.69829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.69850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867199.69893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.69896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.69898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867199.69902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.70045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.70111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.70208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.72023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.72026: stdout chunk (state=3): >>><<< 12033 1726867199.72028: stderr chunk (state=3): >>><<< 12033 1726867199.72108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.72111: _low_level_execute_command(): starting 12033 1726867199.72113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/AnsiballZ_command.py && sleep 0' 12033 1726867199.72917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.72930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867199.72997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867199.73002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.73074: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.73103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.73142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.73230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.92090: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:19:59.881056", "end": "2024-09-20 17:19:59.916229", "delta": "0:00:00.035173", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867199.93562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867199.93572: stdout chunk (state=3): >>><<< 12033 1726867199.93588: stderr chunk (state=3): >>><<< 12033 1726867199.93634: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:19:59.881056", "end": "2024-09-20 17:19:59.916229", "delta": "0:00:00.035173", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867199.93831: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867199.93834: _low_level_execute_command(): starting 12033 1726867199.93837: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867199.5934682-13957-121692167351964/ > /dev/null 2>&1 && sleep 0' 12033 1726867199.95066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867199.95166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867199.95300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867199.95343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867199.95360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867199.95708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867199.97563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867199.97573: stdout chunk (state=3): >>><<< 12033 1726867199.97693: stderr chunk (state=3): >>><<< 12033 
1726867199.97697: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867199.97699: handler run complete 12033 1726867199.97701: Evaluated conditional (False): False 12033 1726867199.97703: attempt loop complete, returning result 12033 1726867199.97705: _execute() done 12033 1726867199.97828: dumping result to json 12033 1726867199.97831: done dumping result, returning 12033 1726867199.97833: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0affcac9-a3a5-74bb-502b-0000000006de] 12033 1726867199.97836: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006de 12033 1726867199.98137: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006de 12033 1726867199.98141: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 
1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.035173", "end": "2024-09-20 17:19:59.916229", "rc": 0, "start": "2024-09-20 17:19:59.881056" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12033 1726867199.98246: no more pending results, returning what we have 12033 1726867199.98252: results queue empty 12033 1726867199.98254: checking for any_errors_fatal 12033 1726867199.98265: done checking for any_errors_fatal 12033 1726867199.98266: checking for max_fail_percentage 12033 1726867199.98268: done checking for max_fail_percentage 12033 1726867199.98269: checking to see if all hosts have failed and the running result is not ok 12033 1726867199.98270: done checking to see if all hosts have failed 12033 1726867199.98271: getting the remaining hosts for this loop 12033 1726867199.98273: done getting the remaining hosts for this loop 12033 1726867199.98279: getting the next task for host managed_node3 12033 1726867199.98511: done getting next task for host managed_node3 12033 1726867199.98514: ^ task is: TASK: Stop dnsmasq/radvd services 12033 1726867199.98518: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867199.98524: getting variables 12033 1726867199.98526: in VariableManager get_vars() 12033 1726867199.98568: Calling all_inventory to load vars for managed_node3 12033 1726867199.98571: Calling groups_inventory to load vars for managed_node3 12033 1726867199.98574: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867199.98874: Calling all_plugins_play to load vars for managed_node3 12033 1726867199.98880: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867199.98884: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.00620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.02465: done with get_vars() 12033 1726867200.02492: done getting variables 12033 1726867200.02551: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 17:20:00 -0400 (0:00:00.483) 0:00:39.141 ****** 12033 1726867200.02594: entering _queue_task() for managed_node3/shell 12033 1726867200.03481: worker is 1 (out of 1 
available) 12033 1726867200.03492: exiting _queue_task() for managed_node3/shell 12033 1726867200.03504: done queuing things up, now waiting for results queue to drain 12033 1726867200.03506: waiting for pending results... 12033 1726867200.03972: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 12033 1726867200.03979: in run() - task 0affcac9-a3a5-74bb-502b-0000000006df 12033 1726867200.03983: variable 'ansible_search_path' from source: unknown 12033 1726867200.03986: variable 'ansible_search_path' from source: unknown 12033 1726867200.03996: calling self._execute() 12033 1726867200.04114: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.04127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.04142: variable 'omit' from source: magic vars 12033 1726867200.04538: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.04558: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.04569: variable 'omit' from source: magic vars 12033 1726867200.04632: variable 'omit' from source: magic vars 12033 1726867200.04671: variable 'omit' from source: magic vars 12033 1726867200.04724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867200.04764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.04833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867200.04837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.04840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.04870: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867200.04881: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.04890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.05128: Set connection var ansible_pipelining to False 12033 1726867200.05289: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.05292: Set connection var ansible_timeout to 10 12033 1726867200.05294: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.05376: Set connection var ansible_connection to ssh 12033 1726867200.05382: Set connection var ansible_shell_type to sh 12033 1726867200.05384: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.05387: variable 'ansible_connection' from source: unknown 12033 1726867200.05390: variable 'ansible_module_compression' from source: unknown 12033 1726867200.05391: variable 'ansible_shell_type' from source: unknown 12033 1726867200.05393: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.05397: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.05399: variable 'ansible_pipelining' from source: unknown 12033 1726867200.05401: variable 'ansible_timeout' from source: unknown 12033 1726867200.05403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.05899: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.06007: variable 'omit' from source: magic vars 12033 1726867200.06010: starting attempt loop 12033 1726867200.06012: running the handler 12033 1726867200.06015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.06017: _low_level_execute_command(): starting 12033 1726867200.06019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867200.07489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.07675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867200.07697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.07778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.09435: stdout chunk (state=3): >>>/root <<< 12033 1726867200.09561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867200.09565: stderr chunk (state=3): >>><<< 12033 1726867200.09570: stdout chunk (state=3): >>><<< 12033 1726867200.09597: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867200.09610: _low_level_execute_command(): starting 12033 1726867200.09685: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111 `" && echo ansible-tmp-1726867200.0959504-13981-42751273422111="` echo /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111 `" ) && sleep 0' 12033 1726867200.11319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867200.11671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.11782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.13670: stdout chunk (state=3): >>>ansible-tmp-1726867200.0959504-13981-42751273422111=/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111 <<< 12033 1726867200.13774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867200.13824: stderr chunk (state=3): >>><<< 12033 1726867200.13828: stdout chunk (state=3): >>><<< 12033 1726867200.13984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867200.0959504-13981-42751273422111=/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867200.13987: variable 'ansible_module_compression' from source: unknown 12033 1726867200.13989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867200.13991: variable 'ansible_facts' from source: unknown 12033 1726867200.14190: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py 12033 1726867200.14483: Sending initial data 12033 1726867200.14501: Sent initial data (155 bytes) 12033 1726867200.15780: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867200.15879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867200.15886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.15906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867200.15911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867200.15925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.16116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867200.16119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867200.16194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.16203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.17736: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867200.17798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867200.17958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjm8e910i /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py <<< 12033 1726867200.17962: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py" <<< 12033 1726867200.18004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjm8e910i" to remote "/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py" <<< 12033 1726867200.19182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867200.19196: stderr chunk (state=3): >>><<< 12033 1726867200.19205: stdout chunk (state=3): >>><<< 12033 1726867200.19261: done transferring module to remote 12033 1726867200.19390: _low_level_execute_command(): starting 12033 1726867200.19397: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/ /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py && sleep 0' 12033 1726867200.20440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.20533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867200.20671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.20735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.22527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867200.22532: stderr chunk (state=3): >>><<< 12033 1726867200.22535: stdout chunk (state=3): >>><<< 12033 1726867200.22551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867200.22554: _low_level_execute_command(): starting 12033 1726867200.22559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/AnsiballZ_command.py && sleep 0' 12033 1726867200.23111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867200.23120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867200.23182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867200.23186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867200.23188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867200.23194: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867200.23198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.23203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867200.23205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867200.23207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867200.23209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867200.23212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867200.23227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867200.23230: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867200.23364: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867200.23367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.23369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867200.23371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867200.23373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.23417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.41182: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:00.382565", "end": "2024-09-20 17:20:00.409012", "delta": "0:00:00.026447", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n 
for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867200.42726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867200.42750: stderr chunk (state=3): >>><<< 12033 1726867200.42753: stdout chunk (state=3): >>><<< 12033 1726867200.42770: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:00.382565", "end": "2024-09-20 17:20:00.409012", "delta": "0:00:00.026447", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd 
--query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867200.42808: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867200.42815: _low_level_execute_command(): starting 12033 1726867200.42820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867200.0959504-13981-42751273422111/ > /dev/null 2>&1 && sleep 0' 12033 1726867200.43241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867200.43280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867200.43284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found <<< 12033 1726867200.43287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.43290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867200.43292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867200.43338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867200.43341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867200.43345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867200.43390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867200.45194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867200.45215: stderr chunk (state=3): >>><<< 12033 1726867200.45219: stdout chunk (state=3): >>><<< 12033 1726867200.45234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867200.45243: handler run complete 12033 1726867200.45261: Evaluated conditional (False): False 12033 1726867200.45278: attempt loop complete, returning result 12033 1726867200.45281: _execute() done 12033 1726867200.45284: dumping result to json 12033 1726867200.45286: done dumping result, returning 12033 1726867200.45305: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0affcac9-a3a5-74bb-502b-0000000006df] 12033 1726867200.45308: sending task result for task 0affcac9-a3a5-74bb-502b-0000000006df 12033 1726867200.45460: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000006df 12033 1726867200.45463: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026447", "end": "2024-09-20 17:20:00.409012", "rc": 0, "start": 
"2024-09-20 17:20:00.382565" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12033 1726867200.45552: no more pending results, returning what we have 12033 1726867200.45556: results queue empty 12033 1726867200.45557: checking for any_errors_fatal 12033 1726867200.45566: done checking for any_errors_fatal 12033 1726867200.45567: checking for max_fail_percentage 12033 1726867200.45568: done checking for max_fail_percentage 12033 1726867200.45569: checking to see if all hosts have failed and the running result is not ok 12033 1726867200.45570: done checking to see if all hosts have failed 12033 1726867200.45571: getting the remaining hosts for this loop 12033 1726867200.45572: done getting the remaining hosts for this loop 12033 1726867200.45576: getting the next task for host managed_node3 12033 1726867200.45589: done getting next task for host managed_node3 12033 1726867200.45591: ^ task is: TASK: Reset bond options to assert 12033 1726867200.45593: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867200.45597: getting variables 12033 1726867200.45599: in VariableManager get_vars() 12033 1726867200.45641: Calling all_inventory to load vars for managed_node3 12033 1726867200.45644: Calling groups_inventory to load vars for managed_node3 12033 1726867200.45646: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.45655: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.45658: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.45660: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.50609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.51523: done with get_vars() 12033 1726867200.51555: done getting variables 12033 1726867200.51611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Friday 20 September 2024 17:20:00 -0400 (0:00:00.490) 0:00:39.632 ****** 12033 1726867200.51636: entering _queue_task() for managed_node3/set_fact 12033 1726867200.52005: worker is 1 (out of 1 available) 12033 1726867200.52019: exiting _queue_task() for managed_node3/set_fact 12033 1726867200.52032: done queuing things up, now waiting for results queue to drain 12033 1726867200.52033: waiting for pending results... 
12033 1726867200.52512: running TaskExecutor() for managed_node3/TASK: Reset bond options to assert 12033 1726867200.52543: in run() - task 0affcac9-a3a5-74bb-502b-00000000000f 12033 1726867200.52547: variable 'ansible_search_path' from source: unknown 12033 1726867200.52552: calling self._execute() 12033 1726867200.52718: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.52722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.52725: variable 'omit' from source: magic vars 12033 1726867200.53122: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.53146: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.53150: variable 'omit' from source: magic vars 12033 1726867200.53205: variable 'omit' from source: magic vars 12033 1726867200.53225: variable 'dhcp_interface1' from source: play vars 12033 1726867200.53306: variable 'dhcp_interface1' from source: play vars 12033 1726867200.53367: variable 'omit' from source: magic vars 12033 1726867200.53370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867200.53412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.53433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867200.53452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.53476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.53597: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.53601: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.53604: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.53631: Set connection var ansible_pipelining to False 12033 1726867200.53642: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.53651: Set connection var ansible_timeout to 10 12033 1726867200.53657: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.53660: Set connection var ansible_connection to ssh 12033 1726867200.53665: Set connection var ansible_shell_type to sh 12033 1726867200.53698: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.53702: variable 'ansible_connection' from source: unknown 12033 1726867200.53705: variable 'ansible_module_compression' from source: unknown 12033 1726867200.53707: variable 'ansible_shell_type' from source: unknown 12033 1726867200.53709: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.53711: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.53714: variable 'ansible_pipelining' from source: unknown 12033 1726867200.53717: variable 'ansible_timeout' from source: unknown 12033 1726867200.53726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.53922: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.53926: variable 'omit' from source: magic vars 12033 1726867200.53929: starting attempt loop 12033 1726867200.53931: running the handler 12033 1726867200.53933: handler run complete 12033 1726867200.53936: attempt loop complete, returning result 12033 1726867200.53944: _execute() done 12033 1726867200.53946: dumping result to json 12033 1726867200.53949: done dumping result, returning 12033 
1726867200.53951: done running TaskExecutor() for managed_node3/TASK: Reset bond options to assert [0affcac9-a3a5-74bb-502b-00000000000f] 12033 1726867200.53953: sending task result for task 0affcac9-a3a5-74bb-502b-00000000000f 12033 1726867200.54091: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000000f 12033 1726867200.54095: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 12033 1726867200.54344: no more pending results, returning what we have 12033 1726867200.54352: results queue empty 12033 1726867200.54353: checking for any_errors_fatal 12033 1726867200.54368: done checking for any_errors_fatal 12033 1726867200.54369: checking for max_fail_percentage 12033 1726867200.54372: done checking for max_fail_percentage 12033 1726867200.54373: checking to see if all hosts have failed and the running result is not ok 12033 1726867200.54373: done checking to see if all hosts have failed 12033 1726867200.54374: getting the remaining hosts for this loop 12033 1726867200.54376: done getting the remaining hosts for this loop 12033 1726867200.54384: getting the next task for host managed_node3 12033 1726867200.54393: done getting next task for host managed_node3 12033 1726867200.54396: ^ task is: TASK: Include the task 'run_test.yml' 12033 1726867200.54399: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867200.54402: getting variables 12033 1726867200.54403: in VariableManager get_vars() 12033 1726867200.54436: Calling all_inventory to load vars for managed_node3 12033 1726867200.54439: Calling groups_inventory to load vars for managed_node3 12033 1726867200.54441: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.54450: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.54453: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.54456: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.55913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.57699: done with get_vars() 12033 1726867200.57730: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Friday 20 September 2024 17:20:00 -0400 (0:00:00.062) 0:00:39.694 ****** 12033 1726867200.57860: entering _queue_task() for managed_node3/include_tasks 12033 1726867200.58246: worker is 1 (out of 1 available) 12033 1726867200.58260: exiting _queue_task() for managed_node3/include_tasks 12033 1726867200.58276: done queuing things up, now waiting for results queue to drain 12033 1726867200.58280: waiting for pending results... 
12033 1726867200.58748: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 12033 1726867200.58847: in run() - task 0affcac9-a3a5-74bb-502b-000000000011 12033 1726867200.58852: variable 'ansible_search_path' from source: unknown 12033 1726867200.58855: calling self._execute() 12033 1726867200.58940: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.58948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.58960: variable 'omit' from source: magic vars 12033 1726867200.59418: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.59434: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.59439: _execute() done 12033 1726867200.59443: dumping result to json 12033 1726867200.59445: done dumping result, returning 12033 1726867200.59452: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [0affcac9-a3a5-74bb-502b-000000000011] 12033 1726867200.59457: sending task result for task 0affcac9-a3a5-74bb-502b-000000000011 12033 1726867200.59572: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000011 12033 1726867200.59576: WORKER PROCESS EXITING 12033 1726867200.59609: no more pending results, returning what we have 12033 1726867200.59615: in VariableManager get_vars() 12033 1726867200.59661: Calling all_inventory to load vars for managed_node3 12033 1726867200.59664: Calling groups_inventory to load vars for managed_node3 12033 1726867200.59666: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.59685: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.59688: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.59693: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.61436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 12033 1726867200.62297: done with get_vars() 12033 1726867200.62312: variable 'ansible_search_path' from source: unknown 12033 1726867200.62325: we have included files to process 12033 1726867200.62326: generating all_blocks data 12033 1726867200.62330: done generating all_blocks data 12033 1726867200.62335: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867200.62336: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867200.62338: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 12033 1726867200.62618: in VariableManager get_vars() 12033 1726867200.62633: done with get_vars() 12033 1726867200.62659: in VariableManager get_vars() 12033 1726867200.62671: done with get_vars() 12033 1726867200.62700: in VariableManager get_vars() 12033 1726867200.62713: done with get_vars() 12033 1726867200.62742: in VariableManager get_vars() 12033 1726867200.62755: done with get_vars() 12033 1726867200.62780: in VariableManager get_vars() 12033 1726867200.62793: done with get_vars() 12033 1726867200.63066: in VariableManager get_vars() 12033 1726867200.63081: done with get_vars() 12033 1726867200.63089: done processing included file 12033 1726867200.63091: iterating over new_blocks loaded from include file 12033 1726867200.63091: in VariableManager get_vars() 12033 1726867200.63101: done with get_vars() 12033 1726867200.63102: filtering new block on tags 12033 1726867200.63167: done filtering new block on tags 12033 1726867200.63169: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 12033 1726867200.63173: extending task lists for all hosts with included 
blocks 12033 1726867200.63199: done extending task lists 12033 1726867200.63200: done processing included files 12033 1726867200.63201: results queue empty 12033 1726867200.63201: checking for any_errors_fatal 12033 1726867200.63203: done checking for any_errors_fatal 12033 1726867200.63204: checking for max_fail_percentage 12033 1726867200.63204: done checking for max_fail_percentage 12033 1726867200.63205: checking to see if all hosts have failed and the running result is not ok 12033 1726867200.63206: done checking to see if all hosts have failed 12033 1726867200.63206: getting the remaining hosts for this loop 12033 1726867200.63207: done getting the remaining hosts for this loop 12033 1726867200.63208: getting the next task for host managed_node3 12033 1726867200.63211: done getting next task for host managed_node3 12033 1726867200.63212: ^ task is: TASK: TEST: {{ lsr_description }} 12033 1726867200.63214: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867200.63215: getting variables 12033 1726867200.63216: in VariableManager get_vars() 12033 1726867200.63223: Calling all_inventory to load vars for managed_node3 12033 1726867200.63224: Calling groups_inventory to load vars for managed_node3 12033 1726867200.63226: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.63230: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.63231: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.63233: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.63926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.65319: done with get_vars() 12033 1726867200.65341: done getting variables 12033 1726867200.65390: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867200.65507: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 17:20:00 -0400 (0:00:00.076) 0:00:39.771 ****** 12033 1726867200.65538: entering _queue_task() for managed_node3/debug 12033 1726867200.65907: worker is 1 (out of 1 available) 12033 1726867200.65922: exiting _queue_task() for managed_node3/debug 12033 1726867200.65935: done queuing things up, now waiting for results queue to drain 12033 1726867200.65937: waiting for pending results... 
12033 1726867200.66324: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 12033 1726867200.66386: in run() - task 0affcac9-a3a5-74bb-502b-0000000008ea 12033 1726867200.66394: variable 'ansible_search_path' from source: unknown 12033 1726867200.66397: variable 'ansible_search_path' from source: unknown 12033 1726867200.66435: calling self._execute() 12033 1726867200.66544: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.66548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.66559: variable 'omit' from source: magic vars 12033 1726867200.66961: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.66974: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.66979: variable 'omit' from source: magic vars 12033 1726867200.67065: variable 'omit' from source: magic vars 12033 1726867200.67117: variable 'lsr_description' from source: include params 12033 1726867200.67137: variable 'omit' from source: magic vars 12033 1726867200.67183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867200.67218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.67239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867200.67259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.67282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.67386: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.67389: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.67395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.67428: Set connection var ansible_pipelining to False 12033 1726867200.67495: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.67499: Set connection var ansible_timeout to 10 12033 1726867200.67502: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.67510: Set connection var ansible_connection to ssh 12033 1726867200.67513: Set connection var ansible_shell_type to sh 12033 1726867200.67515: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.67518: variable 'ansible_connection' from source: unknown 12033 1726867200.67522: variable 'ansible_module_compression' from source: unknown 12033 1726867200.67525: variable 'ansible_shell_type' from source: unknown 12033 1726867200.67527: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.67529: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.67531: variable 'ansible_pipelining' from source: unknown 12033 1726867200.67533: variable 'ansible_timeout' from source: unknown 12033 1726867200.67536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.67718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.67726: variable 'omit' from source: magic vars 12033 1726867200.67729: starting attempt loop 12033 1726867200.67732: running the handler 12033 1726867200.67735: handler run complete 12033 1726867200.67737: attempt loop complete, 
returning result 12033 1726867200.67739: _execute() done 12033 1726867200.67741: dumping result to json 12033 1726867200.67743: done dumping result, returning 12033 1726867200.67746: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [0affcac9-a3a5-74bb-502b-0000000008ea] 12033 1726867200.67758: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ea 12033 1726867200.67906: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ea 12033 1726867200.67909: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 12033 1726867200.68026: no more pending results, returning what we have 12033 1726867200.68030: results queue empty 12033 1726867200.68031: checking for any_errors_fatal 12033 1726867200.68032: done checking for any_errors_fatal 12033 1726867200.68033: checking for max_fail_percentage 12033 1726867200.68035: done checking for max_fail_percentage 12033 1726867200.68036: checking to see if all hosts have failed and the running result is not ok 12033 1726867200.68037: done checking to see if all hosts have failed 12033 1726867200.68038: getting the remaining hosts for this loop 12033 1726867200.68040: done getting the remaining hosts for this loop 12033 1726867200.68044: getting the next task for host managed_node3 12033 1726867200.68051: done getting next task for host managed_node3 12033 1726867200.68053: ^ task is: TASK: Show item 12033 1726867200.68056: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867200.68063: getting variables 12033 1726867200.68065: in VariableManager get_vars() 12033 1726867200.68106: Calling all_inventory to load vars for managed_node3 12033 1726867200.68109: Calling groups_inventory to load vars for managed_node3 12033 1726867200.68112: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.68124: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.68127: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.68130: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.69547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.71057: done with get_vars() 12033 1726867200.71083: done getting variables 12033 1726867200.71146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 17:20:00 -0400 (0:00:00.056) 0:00:39.827 ****** 12033 1726867200.71186: entering _queue_task() for managed_node3/debug 12033 1726867200.71533: worker is 1 (out of 1 available) 12033 
1726867200.71546: exiting _queue_task() for managed_node3/debug 12033 1726867200.71560: done queuing things up, now waiting for results queue to drain 12033 1726867200.71561: waiting for pending results... 12033 1726867200.72104: running TaskExecutor() for managed_node3/TASK: Show item 12033 1726867200.72115: in run() - task 0affcac9-a3a5-74bb-502b-0000000008eb 12033 1726867200.72119: variable 'ansible_search_path' from source: unknown 12033 1726867200.72122: variable 'ansible_search_path' from source: unknown 12033 1726867200.72125: variable 'omit' from source: magic vars 12033 1726867200.72386: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.72390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.72396: variable 'omit' from source: magic vars 12033 1726867200.72602: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.72615: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.72621: variable 'omit' from source: magic vars 12033 1726867200.72781: variable 'omit' from source: magic vars 12033 1726867200.72785: variable 'item' from source: unknown 12033 1726867200.72788: variable 'item' from source: unknown 12033 1726867200.72791: variable 'omit' from source: magic vars 12033 1726867200.72830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867200.72866: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.72888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867200.72906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.72917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 12033 1726867200.72948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.72958: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.72961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.73053: Set connection var ansible_pipelining to False 12033 1726867200.73065: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.73069: Set connection var ansible_timeout to 10 12033 1726867200.73073: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.73076: Set connection var ansible_connection to ssh 12033 1726867200.73083: Set connection var ansible_shell_type to sh 12033 1726867200.73108: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.73112: variable 'ansible_connection' from source: unknown 12033 1726867200.73114: variable 'ansible_module_compression' from source: unknown 12033 1726867200.73117: variable 'ansible_shell_type' from source: unknown 12033 1726867200.73119: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.73121: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.73126: variable 'ansible_pipelining' from source: unknown 12033 1726867200.73129: variable 'ansible_timeout' from source: unknown 12033 1726867200.73133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.73272: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.73285: variable 'omit' from source: magic vars 12033 1726867200.73288: starting attempt loop 12033 1726867200.73296: running the handler 12033 1726867200.73345: variable 
'lsr_description' from source: include params 12033 1726867200.73407: variable 'lsr_description' from source: include params 12033 1726867200.73421: handler run complete 12033 1726867200.73437: attempt loop complete, returning result 12033 1726867200.73450: variable 'item' from source: unknown 12033 1726867200.73507: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 12033 1726867200.73836: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.73840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.73843: variable 'omit' from source: magic vars 12033 1726867200.73847: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.73849: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.73852: variable 'omit' from source: magic vars 12033 1726867200.73883: variable 'omit' from source: magic vars 12033 1726867200.74048: variable 'item' from source: unknown 12033 1726867200.74052: variable 'item' from source: unknown 12033 1726867200.74054: variable 'omit' from source: magic vars 12033 1726867200.74057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.74059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.74062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.74078: variable 'inventory_hostname' 
from source: host vars for 'managed_node3' 12033 1726867200.74086: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.74096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.74185: Set connection var ansible_pipelining to False 12033 1726867200.74201: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.74215: Set connection var ansible_timeout to 10 12033 1726867200.74225: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.74231: Set connection var ansible_connection to ssh 12033 1726867200.74240: Set connection var ansible_shell_type to sh 12033 1726867200.74271: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.74285: variable 'ansible_connection' from source: unknown 12033 1726867200.74296: variable 'ansible_module_compression' from source: unknown 12033 1726867200.74304: variable 'ansible_shell_type' from source: unknown 12033 1726867200.74311: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.74376: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.74381: variable 'ansible_pipelining' from source: unknown 12033 1726867200.74384: variable 'ansible_timeout' from source: unknown 12033 1726867200.74386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.74445: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.74460: variable 'omit' from source: magic vars 12033 1726867200.74469: starting attempt loop 12033 1726867200.74483: running the handler 12033 1726867200.74517: variable 'lsr_setup' from source: include params 12033 1726867200.74599: variable 
'lsr_setup' from source: include params 12033 1726867200.74654: handler run complete 12033 1726867200.74704: attempt loop complete, returning result 12033 1726867200.74722: variable 'item' from source: unknown 12033 1726867200.74812: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 12033 1726867200.75138: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.75142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.75144: variable 'omit' from source: magic vars 12033 1726867200.75179: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.75191: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.75203: variable 'omit' from source: magic vars 12033 1726867200.75221: variable 'omit' from source: magic vars 12033 1726867200.75275: variable 'item' from source: unknown 12033 1726867200.75344: variable 'item' from source: unknown 12033 1726867200.75375: variable 'omit' from source: magic vars 12033 1726867200.75405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.75419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.75463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.75467: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.75476: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.75481: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12033 1726867200.75549: Set connection var ansible_pipelining to False 12033 1726867200.75563: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.75680: Set connection var ansible_timeout to 10 12033 1726867200.75684: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.75688: Set connection var ansible_connection to ssh 12033 1726867200.75690: Set connection var ansible_shell_type to sh 12033 1726867200.75694: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.75697: variable 'ansible_connection' from source: unknown 12033 1726867200.75698: variable 'ansible_module_compression' from source: unknown 12033 1726867200.75700: variable 'ansible_shell_type' from source: unknown 12033 1726867200.75702: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.75704: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.75706: variable 'ansible_pipelining' from source: unknown 12033 1726867200.75708: variable 'ansible_timeout' from source: unknown 12033 1726867200.75710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.75882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.75887: variable 'omit' from source: magic vars 12033 1726867200.75890: starting attempt loop 12033 1726867200.75896: running the handler 12033 1726867200.75899: variable 'lsr_test' from source: include params 12033 1726867200.75915: variable 'lsr_test' from source: include params 12033 1726867200.75948: handler run complete 12033 1726867200.75965: attempt loop complete, returning result 12033 1726867200.75985: variable 'item' from source: unknown 12033 
1726867200.76055: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] } 12033 1726867200.76388: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.76391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.76396: variable 'omit' from source: magic vars 12033 1726867200.76398: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.76399: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.76407: variable 'omit' from source: magic vars 12033 1726867200.76422: variable 'omit' from source: magic vars 12033 1726867200.76460: variable 'item' from source: unknown 12033 1726867200.76533: variable 'item' from source: unknown 12033 1726867200.76551: variable 'omit' from source: magic vars 12033 1726867200.76606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.76609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.76611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.76612: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.76618: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.76624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.76700: Set connection var ansible_pipelining to False 12033 1726867200.76723: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.76783: Set connection var ansible_timeout to 10 12033 1726867200.76786: Set connection 
var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.76789: Set connection var ansible_connection to ssh 12033 1726867200.76791: Set connection var ansible_shell_type to sh 12033 1726867200.76795: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.76798: variable 'ansible_connection' from source: unknown 12033 1726867200.76801: variable 'ansible_module_compression' from source: unknown 12033 1726867200.76803: variable 'ansible_shell_type' from source: unknown 12033 1726867200.76806: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.76808: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.76811: variable 'ansible_pipelining' from source: unknown 12033 1726867200.76829: variable 'ansible_timeout' from source: unknown 12033 1726867200.76839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.76984: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.76987: variable 'omit' from source: magic vars 12033 1726867200.76989: starting attempt loop 12033 1726867200.76994: running the handler 12033 1726867200.76997: variable 'lsr_assert' from source: include params 12033 1726867200.77062: variable 'lsr_assert' from source: include params 12033 1726867200.77086: handler run complete 12033 1726867200.77107: attempt loop complete, returning result 12033 1726867200.77126: variable 'item' from source: unknown 12033 1726867200.77209: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] } 12033 1726867200.77472: variable 'ansible_host' from source: host vars for 
'managed_node3' 12033 1726867200.77483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.77486: variable 'omit' from source: magic vars 12033 1726867200.77682: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.77689: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.77694: variable 'omit' from source: magic vars 12033 1726867200.77697: variable 'omit' from source: magic vars 12033 1726867200.77781: variable 'item' from source: unknown 12033 1726867200.77812: variable 'item' from source: unknown 12033 1726867200.78162: variable 'omit' from source: magic vars 12033 1726867200.78165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.78168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.78170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.78172: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.78174: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.78176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.78179: Set connection var ansible_pipelining to False 12033 1726867200.78182: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.78184: Set connection var ansible_timeout to 10 12033 1726867200.78186: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.78188: Set connection var ansible_connection to ssh 12033 1726867200.78190: Set connection var ansible_shell_type to sh 12033 1726867200.78194: variable 'ansible_shell_executable' from source: unknown 12033 
1726867200.78196: variable 'ansible_connection' from source: unknown 12033 1726867200.78197: variable 'ansible_module_compression' from source: unknown 12033 1726867200.78199: variable 'ansible_shell_type' from source: unknown 12033 1726867200.78201: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.78202: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.78204: variable 'ansible_pipelining' from source: unknown 12033 1726867200.78207: variable 'ansible_timeout' from source: unknown 12033 1726867200.78209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.78309: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.78326: variable 'omit' from source: magic vars 12033 1726867200.78335: starting attempt loop 12033 1726867200.78342: running the handler 12033 1726867200.78462: handler run complete 12033 1726867200.78487: attempt loop complete, returning result 12033 1726867200.78509: variable 'item' from source: unknown 12033 1726867200.78572: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 12033 1726867200.78812: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.78815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.78817: variable 'omit' from source: magic vars 12033 1726867200.78932: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.78944: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 
1726867200.78952: variable 'omit' from source: magic vars 12033 1726867200.78970: variable 'omit' from source: magic vars 12033 1726867200.79035: variable 'item' from source: unknown 12033 1726867200.79108: variable 'item' from source: unknown 12033 1726867200.79127: variable 'omit' from source: magic vars 12033 1726867200.79154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.79166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.79248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.79251: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.79254: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.79256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.79288: Set connection var ansible_pipelining to False 12033 1726867200.79304: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.79316: Set connection var ansible_timeout to 10 12033 1726867200.79325: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.79332: Set connection var ansible_connection to ssh 12033 1726867200.79341: Set connection var ansible_shell_type to sh 12033 1726867200.79372: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.79383: variable 'ansible_connection' from source: unknown 12033 1726867200.79394: variable 'ansible_module_compression' from source: unknown 12033 1726867200.79402: variable 'ansible_shell_type' from source: unknown 12033 1726867200.79409: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.79416: variable 'ansible_host' from source: host vars for 
'managed_node3' 12033 1726867200.79424: variable 'ansible_pipelining' from source: unknown 12033 1726867200.79466: variable 'ansible_timeout' from source: unknown 12033 1726867200.79469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.79537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.79549: variable 'omit' from source: magic vars 12033 1726867200.79558: starting attempt loop 12033 1726867200.79564: running the handler 12033 1726867200.79690: variable 'lsr_fail_debug' from source: play vars 12033 1726867200.79693: variable 'lsr_fail_debug' from source: play vars 12033 1726867200.79695: handler run complete 12033 1726867200.79697: attempt loop complete, returning result 12033 1726867200.79719: variable 'item' from source: unknown 12033 1726867200.79832: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 12033 1726867200.80032: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.80038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.80049: variable 'omit' from source: magic vars 12033 1726867200.80315: variable 'ansible_distribution_major_version' from source: facts 12033 1726867200.80318: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867200.80323: variable 'omit' from source: magic vars 12033 1726867200.80338: variable 'omit' from source: magic vars 12033 1726867200.80375: variable 'item' from source: unknown 12033 1726867200.80657: variable 'item' from source: unknown 12033 1726867200.80671: 
variable 'omit' from source: magic vars 12033 1726867200.80690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867200.80701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.80708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867200.80717: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867200.80720: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.80723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867200.80898: Set connection var ansible_pipelining to False 12033 1726867200.80901: Set connection var ansible_shell_executable to /bin/sh 12033 1726867200.80903: Set connection var ansible_timeout to 10 12033 1726867200.80908: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867200.80911: Set connection var ansible_connection to ssh 12033 1726867200.80918: Set connection var ansible_shell_type to sh 12033 1726867200.80934: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.80937: variable 'ansible_connection' from source: unknown 12033 1726867200.80940: variable 'ansible_module_compression' from source: unknown 12033 1726867200.80942: variable 'ansible_shell_type' from source: unknown 12033 1726867200.80944: variable 'ansible_shell_executable' from source: unknown 12033 1726867200.80946: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867200.81035: variable 'ansible_pipelining' from source: unknown 12033 1726867200.81038: variable 'ansible_timeout' from source: unknown 12033 1726867200.81044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 
1726867200.81132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867200.81139: variable 'omit' from source: magic vars 12033 1726867200.81143: starting attempt loop 12033 1726867200.81146: running the handler 12033 1726867200.81163: variable 'lsr_cleanup' from source: include params 12033 1726867200.81244: variable 'lsr_cleanup' from source: include params 12033 1726867200.81270: handler run complete 12033 1726867200.81393: attempt loop complete, returning result 12033 1726867200.81414: variable 'item' from source: unknown 12033 1726867200.81491: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 12033 1726867200.81714: dumping result to json 12033 1726867200.81717: done dumping result, returning 12033 1726867200.81719: done running TaskExecutor() for managed_node3/TASK: Show item [0affcac9-a3a5-74bb-502b-0000000008eb] 12033 1726867200.81721: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008eb 12033 1726867200.81773: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008eb 12033 1726867200.81776: WORKER PROCESS EXITING 12033 1726867200.81867: no more pending results, returning what we have 12033 1726867200.81871: results queue empty 12033 1726867200.81873: checking for any_errors_fatal 12033 1726867200.81884: done checking for any_errors_fatal 12033 1726867200.81885: checking for max_fail_percentage 12033 1726867200.81887: done checking for max_fail_percentage 12033 1726867200.81888: checking to see if all hosts have failed and the running result is not ok 
12033 1726867200.81889: done checking to see if all hosts have failed 12033 1726867200.81890: getting the remaining hosts for this loop 12033 1726867200.81892: done getting the remaining hosts for this loop 12033 1726867200.81896: getting the next task for host managed_node3 12033 1726867200.81902: done getting next task for host managed_node3 12033 1726867200.81905: ^ task is: TASK: Include the task 'show_interfaces.yml' 12033 1726867200.81909: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867200.81913: getting variables 12033 1726867200.81915: in VariableManager get_vars() 12033 1726867200.81956: Calling all_inventory to load vars for managed_node3 12033 1726867200.81959: Calling groups_inventory to load vars for managed_node3 12033 1726867200.81962: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867200.81973: Calling all_plugins_play to load vars for managed_node3 12033 1726867200.81976: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867200.81982: Calling groups_plugins_play to load vars for managed_node3 12033 1726867200.84008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867200.86995: done with get_vars() 12033 1726867200.87022: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 17:20:00 -0400 (0:00:00.160) 0:00:39.987 ****** 12033 1726867200.87215: entering _queue_task() for managed_node3/include_tasks 12033 1726867200.87554: worker is 1 (out of 1 available) 12033 1726867200.87566: exiting _queue_task() for managed_node3/include_tasks 12033 1726867200.87717: done queuing things up, now waiting for results queue to drain 12033 1726867200.87719: waiting for pending results... 
12033 1726867200.87997: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml'
12033 1726867200.88002: in run() - task 0affcac9-a3a5-74bb-502b-0000000008ec
12033 1726867200.88017: variable 'ansible_search_path' from source: unknown
12033 1726867200.88024: variable 'ansible_search_path' from source: unknown
12033 1726867200.88265: calling self._execute()
12033 1726867200.88276: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867200.88295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867200.88313: variable 'omit' from source: magic vars
12033 1726867200.89157: variable 'ansible_distribution_major_version' from source: facts
12033 1726867200.89169: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867200.89174: _execute() done
12033 1726867200.89178: dumping result to json
12033 1726867200.89486: done dumping result, returning
12033 1726867200.89489: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-74bb-502b-0000000008ec]
12033 1726867200.89492: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ec
12033 1726867200.89563: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ec
12033 1726867200.89568: WORKER PROCESS EXITING
12033 1726867200.89601: no more pending results, returning what we have
12033 1726867200.89605: in VariableManager get_vars()
12033 1726867200.89643: Calling all_inventory to load vars for managed_node3
12033 1726867200.89646: Calling groups_inventory to load vars for managed_node3
12033 1726867200.89648: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867200.89657: Calling all_plugins_play to load vars for managed_node3
12033 1726867200.89659: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867200.89662: Calling groups_plugins_play to load vars for managed_node3
12033 1726867200.92553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867200.94451: done with get_vars()
12033 1726867200.94589: variable 'ansible_search_path' from source: unknown
12033 1726867200.94591: variable 'ansible_search_path' from source: unknown
12033 1726867200.94632: we have included files to process
12033 1726867200.94634: generating all_blocks data
12033 1726867200.94635: done generating all_blocks data
12033 1726867200.94640: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
12033 1726867200.94641: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
12033 1726867200.94643: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
12033 1726867200.94859: in VariableManager get_vars()
12033 1726867200.94885: done with get_vars()
12033 1726867200.95182: done processing included file
12033 1726867200.95185: iterating over new_blocks loaded from include file
12033 1726867200.95186: in VariableManager get_vars()
12033 1726867200.95206: done with get_vars()
12033 1726867200.95208: filtering new block on tags
12033 1726867200.95357: done filtering new block on tags
12033 1726867200.95360: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3
12033 1726867200.95365: extending task lists for all hosts with included blocks
12033 1726867200.96067: done extending task lists
12033 1726867200.96069: done processing included files
12033 1726867200.96069: results queue empty
12033 1726867200.96070: checking for any_errors_fatal
12033 1726867200.96085: done checking for any_errors_fatal
12033 1726867200.96086: checking for max_fail_percentage
12033 1726867200.96088: done checking for max_fail_percentage
12033 1726867200.96089: checking to see if all hosts have failed and the running result is not ok
12033 1726867200.96089: done checking to see if all hosts have failed
12033 1726867200.96090: getting the remaining hosts for this loop
12033 1726867200.96094: done getting the remaining hosts for this loop
12033 1726867200.96097: getting the next task for host managed_node3
12033 1726867200.96101: done getting next task for host managed_node3
12033 1726867200.96103: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
12033 1726867200.96106: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867200.96109: getting variables
12033 1726867200.96110: in VariableManager get_vars()
12033 1726867200.96122: Calling all_inventory to load vars for managed_node3
12033 1726867200.96124: Calling groups_inventory to load vars for managed_node3
12033 1726867200.96126: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867200.96132: Calling all_plugins_play to load vars for managed_node3
12033 1726867200.96134: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867200.96137: Calling groups_plugins_play to load vars for managed_node3
12033 1726867200.97596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867201.00251: done with get_vars()
12033 1726867201.00274: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 17:20:01 -0400 (0:00:00.131) 0:00:40.119 ******
12033 1726867201.00365: entering _queue_task() for managed_node3/include_tasks
12033 1726867201.00819: worker is 1 (out of 1 available)
12033 1726867201.00830: exiting _queue_task() for managed_node3/include_tasks
12033 1726867201.00844: done queuing things up, now waiting for results queue to drain
12033 1726867201.00845: waiting for pending results...
12033 1726867201.01061: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml'
12033 1726867201.01187: in run() - task 0affcac9-a3a5-74bb-502b-000000000913
12033 1726867201.01210: variable 'ansible_search_path' from source: unknown
12033 1726867201.01225: variable 'ansible_search_path' from source: unknown
12033 1726867201.01265: calling self._execute()
12033 1726867201.01381: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867201.01400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867201.01417: variable 'omit' from source: magic vars
12033 1726867201.01855: variable 'ansible_distribution_major_version' from source: facts
12033 1726867201.01883: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867201.01937: _execute() done
12033 1726867201.01940: dumping result to json
12033 1726867201.01942: done dumping result, returning
12033 1726867201.01945: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-74bb-502b-000000000913]
12033 1726867201.01948: sending task result for task 0affcac9-a3a5-74bb-502b-000000000913
12033 1726867201.02118: no more pending results, returning what we have
12033 1726867201.02124: in VariableManager get_vars()
12033 1726867201.02170: Calling all_inventory to load vars for managed_node3
12033 1726867201.02173: Calling groups_inventory to load vars for managed_node3
12033 1726867201.02176: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867201.02197: Calling all_plugins_play to load vars for managed_node3
12033 1726867201.02201: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867201.02205: Calling groups_plugins_play to load vars for managed_node3
12033 1726867201.03195: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000913
12033 1726867201.03199: WORKER PROCESS EXITING
12033 1726867201.05153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867201.08721: done with get_vars()
12033 1726867201.08741: variable 'ansible_search_path' from source: unknown
12033 1726867201.08743: variable 'ansible_search_path' from source: unknown
12033 1726867201.08794: we have included files to process
12033 1726867201.08796: generating all_blocks data
12033 1726867201.08797: done generating all_blocks data
12033 1726867201.08799: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
12033 1726867201.08800: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
12033 1726867201.08803: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
12033 1726867201.09111: done processing included file
12033 1726867201.09113: iterating over new_blocks loaded from include file
12033 1726867201.09115: in VariableManager get_vars()
12033 1726867201.09134: done with get_vars()
12033 1726867201.09136: filtering new block on tags
12033 1726867201.09172: done filtering new block on tags
12033 1726867201.09175: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3
12033 1726867201.09183: extending task lists for all hosts with included blocks
12033 1726867201.09366: done extending task lists
12033 1726867201.09367: done processing included files
12033 1726867201.09368: results queue empty
12033 1726867201.09369: checking for any_errors_fatal
12033 1726867201.09371: done checking for any_errors_fatal
12033 1726867201.09372: checking for max_fail_percentage
12033 1726867201.09373: done checking for max_fail_percentage
12033 1726867201.09374: checking to see if all hosts have failed and the running result is not ok
12033 1726867201.09375: done checking to see if all hosts have failed
12033 1726867201.09376: getting the remaining hosts for this loop
12033 1726867201.09378: done getting the remaining hosts for this loop
12033 1726867201.09381: getting the next task for host managed_node3
12033 1726867201.09385: done getting next task for host managed_node3
12033 1726867201.09387: ^ task is: TASK: Gather current interface info
12033 1726867201.09390: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867201.09395: getting variables
12033 1726867201.09396: in VariableManager get_vars()
12033 1726867201.09407: Calling all_inventory to load vars for managed_node3
12033 1726867201.09409: Calling groups_inventory to load vars for managed_node3
12033 1726867201.09411: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867201.09416: Calling all_plugins_play to load vars for managed_node3
12033 1726867201.09418: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867201.09428: Calling groups_plugins_play to load vars for managed_node3
12033 1726867201.10662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867201.13008: done with get_vars()
12033 1726867201.13063: done getting variables
12033 1726867201.13143: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 17:20:01 -0400 (0:00:00.128) 0:00:40.247 ******
12033 1726867201.13182: entering _queue_task() for managed_node3/command
12033 1726867201.13561: worker is 1 (out of 1 available)
12033 1726867201.13690: exiting _queue_task() for managed_node3/command
12033 1726867201.13706: done queuing things up, now waiting for results queue to drain
12033 1726867201.13708: waiting for pending results...
12033 1726867201.13928: running TaskExecutor() for managed_node3/TASK: Gather current interface info
12033 1726867201.14072: in run() - task 0affcac9-a3a5-74bb-502b-00000000094e
12033 1726867201.14098: variable 'ansible_search_path' from source: unknown
12033 1726867201.14107: variable 'ansible_search_path' from source: unknown
12033 1726867201.14150: calling self._execute()
12033 1726867201.14268: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867201.14282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867201.14300: variable 'omit' from source: magic vars
12033 1726867201.14717: variable 'ansible_distribution_major_version' from source: facts
12033 1726867201.14735: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867201.14745: variable 'omit' from source: magic vars
12033 1726867201.14812: variable 'omit' from source: magic vars
12033 1726867201.14851: variable 'omit' from source: magic vars
12033 1726867201.14904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12033 1726867201.14950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12033 1726867201.14973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12033 1726867201.15002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867201.15025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12033 1726867201.15063: variable 'inventory_hostname' from source: host vars for 'managed_node3'
12033 1726867201.15071: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867201.15080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867201.15201: Set connection var ansible_pipelining to False
12033 1726867201.15215: Set connection var ansible_shell_executable to /bin/sh
12033 1726867201.15227: Set connection var ansible_timeout to 10
12033 1726867201.15246: Set connection var ansible_module_compression to ZIP_DEFLATED
12033 1726867201.15309: Set connection var ansible_connection to ssh
12033 1726867201.15312: Set connection var ansible_shell_type to sh
12033 1726867201.15314: variable 'ansible_shell_executable' from source: unknown
12033 1726867201.15317: variable 'ansible_connection' from source: unknown
12033 1726867201.15319: variable 'ansible_module_compression' from source: unknown
12033 1726867201.15321: variable 'ansible_shell_type' from source: unknown
12033 1726867201.15323: variable 'ansible_shell_executable' from source: unknown
12033 1726867201.15325: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867201.15327: variable 'ansible_pipelining' from source: unknown
12033 1726867201.15329: variable 'ansible_timeout' from source: unknown
12033 1726867201.15332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867201.15510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12033 1726867201.15608: variable 'omit' from source: magic vars
12033 1726867201.15619: starting attempt loop
12033 1726867201.15626: running the handler
12033 1726867201.15650: _low_level_execute_command(): starting
12033 1726867201.15662: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12033 1726867201.16448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867201.16462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867201.16498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<<
12033 1726867201.16568: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.16608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867201.16624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867201.16642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867201.16801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867201.18502: stdout chunk (state=3): >>>/root <<<
12033 1726867201.18698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867201.18701: stdout chunk (state=3): >>><<<
12033 1726867201.18703: stderr chunk (state=3): >>><<<
12033 1726867201.18723: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867201.18745: _low_level_execute_command(): starting
12033 1726867201.18829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061 `" && echo ansible-tmp-1726867201.1873016-14030-25689814663061="` echo /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061 `" ) && sleep 0'
12033 1726867201.19398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867201.19485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867201.19490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867201.19495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.19505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867201.19508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<<
12033 1726867201.19510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.19512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867201.19553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867201.19557: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867201.19613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867201.21514: stdout chunk (state=3): >>>ansible-tmp-1726867201.1873016-14030-25689814663061=/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061 <<<
12033 1726867201.21670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867201.21674: stdout chunk (state=3): >>><<<
12033 1726867201.21678: stderr chunk (state=3): >>><<<
12033 1726867201.21698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867201.1873016-14030-25689814663061=/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867201.21802: variable 'ansible_module_compression' from source: unknown
12033 1726867201.21805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
12033 1726867201.21983: variable 'ansible_facts' from source: unknown
12033 1726867201.21986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py
12033 1726867201.22117: Sending initial data
12033 1726867201.22121: Sent initial data (155 bytes)
12033 1726867201.22607: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867201.22617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867201.22628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867201.22664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.22707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867201.22719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867201.22771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867201.24431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
12033 1726867201.24471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
12033 1726867201.24516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp9gh5nrup /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py <<<
12033 1726867201.24519: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py" <<<
12033 1726867201.24586: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp9gh5nrup" to remote "/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py" <<<
12033 1726867201.25675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867201.25684: stderr chunk (state=3): >>><<<
12033 1726867201.25689: stdout chunk (state=3): >>><<<
12033 1726867201.25713: done transferring module to remote
12033 1726867201.25723: _low_level_execute_command(): starting
12033 1726867201.25726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/ /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py && sleep 0'
12033 1726867201.26356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867201.26383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867201.26386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867201.26461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867201.28324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12033 1726867201.28364: stderr chunk (state=3): >>><<<
12033 1726867201.28373: stdout chunk (state=3): >>><<<
12033 1726867201.28394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12033 1726867201.28398: _low_level_execute_command(): starting
12033 1726867201.28401: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/AnsiballZ_command.py && sleep 0'
12033 1726867201.28963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12033 1726867201.28980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867201.28983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867201.28999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12033 1726867201.29085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<<
12033 1726867201.29088: stderr chunk (state=3): >>>debug2: match not found <<<
12033 1726867201.29090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.29094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12033 1726867201.29097: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<<
12033 1726867201.29098: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12033 1726867201.29100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12033 1726867201.29102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12033 1726867201.29119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12033 1726867201.29196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<<
12033 1726867201.29200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12033 1726867201.29203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12033 1726867201.29254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12033 1726867201.44851: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:20:01.442903", "end": "2024-09-20 17:20:01.446193", "delta": "0:00:00.003290", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
12033 1726867201.46338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed.
<<< 12033 1726867201.46367: stderr chunk (state=3): >>><<< 12033 1726867201.46372: stdout chunk (state=3): >>><<< 12033 1726867201.46391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:20:01.442903", "end": "2024-09-20 17:20:01.446193", "delta": "0:00:00.003290", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867201.46422: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867201.46430: _low_level_execute_command(): starting 12033 1726867201.46435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867201.1873016-14030-25689814663061/ > /dev/null 2>&1 && sleep 0' 12033 1726867201.46889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867201.46893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867201.46899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.46902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.46904: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.46957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867201.46964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867201.46966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867201.47007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867201.48829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867201.48856: stderr chunk (state=3): >>><<< 12033 1726867201.48859: stdout chunk (state=3): >>><<< 12033 1726867201.48871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867201.48878: handler run complete 12033 1726867201.48897: Evaluated conditional (False): False 12033 1726867201.48906: attempt loop complete, returning result 12033 1726867201.48908: _execute() done 12033 1726867201.48911: dumping result to json 12033 1726867201.48915: done dumping result, returning 12033 1726867201.48922: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0affcac9-a3a5-74bb-502b-00000000094e] 12033 1726867201.48927: sending task result for task 0affcac9-a3a5-74bb-502b-00000000094e 12033 1726867201.49030: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000094e 12033 1726867201.49033: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003290", "end": "2024-09-20 17:20:01.446193", "rc": 0, "start": "2024-09-20 17:20:01.442903" } STDOUT: bonding_masters eth0 lo 12033 1726867201.49132: no more pending results, returning what we have 12033 1726867201.49136: results queue empty 12033 1726867201.49137: checking for any_errors_fatal 12033 1726867201.49139: done checking for any_errors_fatal 12033 1726867201.49139: checking for max_fail_percentage 12033 1726867201.49141: done checking for max_fail_percentage 12033 1726867201.49142: checking to see if all hosts have failed and the running result is not ok 12033 1726867201.49142: done checking to see if all hosts have failed 12033 1726867201.49143: getting the remaining hosts for this loop 12033 1726867201.49145: done getting the remaining hosts for this loop 12033 1726867201.49148: getting the next task for host managed_node3 12033 1726867201.49157: done getting next task for host managed_node3 12033 1726867201.49159: ^ task is: TASK: Set current_interfaces 12033 1726867201.49164: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867201.49168: getting variables 12033 1726867201.49169: in VariableManager get_vars() 12033 1726867201.49207: Calling all_inventory to load vars for managed_node3 12033 1726867201.49210: Calling groups_inventory to load vars for managed_node3 12033 1726867201.49212: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867201.49222: Calling all_plugins_play to load vars for managed_node3 12033 1726867201.49224: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867201.49226: Calling groups_plugins_play to load vars for managed_node3 12033 1726867201.50111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867201.50971: done with get_vars() 12033 1726867201.50989: done getting variables 12033 1726867201.51034: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:20:01 -0400 (0:00:00.378) 0:00:40.626 ****** 12033 1726867201.51059: entering _queue_task() for managed_node3/set_fact 12033 1726867201.51292: worker is 1 (out of 1 available) 12033 1726867201.51304: exiting _queue_task() for managed_node3/set_fact 12033 1726867201.51317: done queuing things up, now waiting for results queue to drain 12033 1726867201.51318: waiting for pending results... 
12033 1726867201.51501: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 12033 1726867201.51594: in run() - task 0affcac9-a3a5-74bb-502b-00000000094f 12033 1726867201.51608: variable 'ansible_search_path' from source: unknown 12033 1726867201.51612: variable 'ansible_search_path' from source: unknown 12033 1726867201.51641: calling self._execute() 12033 1726867201.51717: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.51721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.51729: variable 'omit' from source: magic vars 12033 1726867201.52011: variable 'ansible_distribution_major_version' from source: facts 12033 1726867201.52021: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867201.52027: variable 'omit' from source: magic vars 12033 1726867201.52058: variable 'omit' from source: magic vars 12033 1726867201.52136: variable '_current_interfaces' from source: set_fact 12033 1726867201.52185: variable 'omit' from source: magic vars 12033 1726867201.52220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867201.52247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867201.52262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867201.52275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.52286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.52314: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867201.52317: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.52321: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.52390: Set connection var ansible_pipelining to False 12033 1726867201.52399: Set connection var ansible_shell_executable to /bin/sh 12033 1726867201.52408: Set connection var ansible_timeout to 10 12033 1726867201.52411: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867201.52414: Set connection var ansible_connection to ssh 12033 1726867201.52417: Set connection var ansible_shell_type to sh 12033 1726867201.52442: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.52446: variable 'ansible_connection' from source: unknown 12033 1726867201.52448: variable 'ansible_module_compression' from source: unknown 12033 1726867201.52450: variable 'ansible_shell_type' from source: unknown 12033 1726867201.52453: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.52455: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.52457: variable 'ansible_pipelining' from source: unknown 12033 1726867201.52459: variable 'ansible_timeout' from source: unknown 12033 1726867201.52461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.52560: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867201.52571: variable 'omit' from source: magic vars 12033 1726867201.52574: starting attempt loop 12033 1726867201.52581: running the handler 12033 1726867201.52589: handler run complete 12033 1726867201.52600: attempt loop complete, returning result 12033 1726867201.52602: _execute() done 12033 1726867201.52605: dumping result to json 12033 1726867201.52607: done dumping result, returning 12033 
1726867201.52613: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0affcac9-a3a5-74bb-502b-00000000094f] 12033 1726867201.52618: sending task result for task 0affcac9-a3a5-74bb-502b-00000000094f 12033 1726867201.52707: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000094f 12033 1726867201.52709: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 12033 1726867201.52783: no more pending results, returning what we have 12033 1726867201.52786: results queue empty 12033 1726867201.52787: checking for any_errors_fatal 12033 1726867201.52793: done checking for any_errors_fatal 12033 1726867201.52795: checking for max_fail_percentage 12033 1726867201.52796: done checking for max_fail_percentage 12033 1726867201.52797: checking to see if all hosts have failed and the running result is not ok 12033 1726867201.52798: done checking to see if all hosts have failed 12033 1726867201.52799: getting the remaining hosts for this loop 12033 1726867201.52800: done getting the remaining hosts for this loop 12033 1726867201.52803: getting the next task for host managed_node3 12033 1726867201.52810: done getting next task for host managed_node3 12033 1726867201.52812: ^ task is: TASK: Show current_interfaces 12033 1726867201.52816: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867201.52820: getting variables 12033 1726867201.52821: in VariableManager get_vars() 12033 1726867201.52849: Calling all_inventory to load vars for managed_node3 12033 1726867201.52852: Calling groups_inventory to load vars for managed_node3 12033 1726867201.52854: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867201.52862: Calling all_plugins_play to load vars for managed_node3 12033 1726867201.52864: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867201.52867: Calling groups_plugins_play to load vars for managed_node3 12033 1726867201.53610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867201.54480: done with get_vars() 12033 1726867201.54497: done getting variables 12033 1726867201.54537: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:20:01 -0400 (0:00:00.034) 0:00:40.661 ****** 12033 1726867201.54557: entering _queue_task() for managed_node3/debug 12033 1726867201.54753: worker is 1 (out of 1 available) 12033 1726867201.54766: exiting _queue_task() for managed_node3/debug 12033 1726867201.54779: done queuing things up, now waiting for results queue to drain 12033 1726867201.54781: waiting for pending results... 
12033 1726867201.54957: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 12033 1726867201.55032: in run() - task 0affcac9-a3a5-74bb-502b-000000000914 12033 1726867201.55044: variable 'ansible_search_path' from source: unknown 12033 1726867201.55048: variable 'ansible_search_path' from source: unknown 12033 1726867201.55075: calling self._execute() 12033 1726867201.55149: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.55153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.55162: variable 'omit' from source: magic vars 12033 1726867201.55444: variable 'ansible_distribution_major_version' from source: facts 12033 1726867201.55458: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867201.55462: variable 'omit' from source: magic vars 12033 1726867201.55494: variable 'omit' from source: magic vars 12033 1726867201.55567: variable 'current_interfaces' from source: set_fact 12033 1726867201.55583: variable 'omit' from source: magic vars 12033 1726867201.55614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867201.55640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867201.55655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867201.55671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.55684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.55708: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867201.55711: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.55713: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.55782: Set connection var ansible_pipelining to False 12033 1726867201.55795: Set connection var ansible_shell_executable to /bin/sh 12033 1726867201.55799: Set connection var ansible_timeout to 10 12033 1726867201.55801: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867201.55804: Set connection var ansible_connection to ssh 12033 1726867201.55806: Set connection var ansible_shell_type to sh 12033 1726867201.55821: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.55824: variable 'ansible_connection' from source: unknown 12033 1726867201.55827: variable 'ansible_module_compression' from source: unknown 12033 1726867201.55829: variable 'ansible_shell_type' from source: unknown 12033 1726867201.55832: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.55834: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.55838: variable 'ansible_pipelining' from source: unknown 12033 1726867201.55840: variable 'ansible_timeout' from source: unknown 12033 1726867201.55844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.55945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867201.55953: variable 'omit' from source: magic vars 12033 1726867201.55959: starting attempt loop 12033 1726867201.55962: running the handler 12033 1726867201.56000: handler run complete 12033 1726867201.56013: attempt loop complete, returning result 12033 1726867201.56016: _execute() done 12033 1726867201.56019: dumping result to json 12033 1726867201.56021: done dumping result, returning 12033 1726867201.56027: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0affcac9-a3a5-74bb-502b-000000000914] 12033 1726867201.56031: sending task result for task 0affcac9-a3a5-74bb-502b-000000000914 12033 1726867201.56114: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000914 12033 1726867201.56117: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 12033 1726867201.56162: no more pending results, returning what we have 12033 1726867201.56165: results queue empty 12033 1726867201.56166: checking for any_errors_fatal 12033 1726867201.56172: done checking for any_errors_fatal 12033 1726867201.56172: checking for max_fail_percentage 12033 1726867201.56174: done checking for max_fail_percentage 12033 1726867201.56175: checking to see if all hosts have failed and the running result is not ok 12033 1726867201.56175: done checking to see if all hosts have failed 12033 1726867201.56176: getting the remaining hosts for this loop 12033 1726867201.56180: done getting the remaining hosts for this loop 12033 1726867201.56184: getting the next task for host managed_node3 12033 1726867201.56193: done getting next task for host managed_node3 12033 1726867201.56196: ^ task is: TASK: Setup 12033 1726867201.56199: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867201.56202: getting variables 12033 1726867201.56204: in VariableManager get_vars() 12033 1726867201.56234: Calling all_inventory to load vars for managed_node3 12033 1726867201.56236: Calling groups_inventory to load vars for managed_node3 12033 1726867201.56238: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867201.56247: Calling all_plugins_play to load vars for managed_node3 12033 1726867201.56249: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867201.56251: Calling groups_plugins_play to load vars for managed_node3 12033 1726867201.57634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867201.58494: done with get_vars() 12033 1726867201.58508: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 17:20:01 -0400 (0:00:00.040) 0:00:40.701 ****** 12033 1726867201.58566: entering _queue_task() for managed_node3/include_tasks 12033 1726867201.58763: worker is 1 (out of 1 available) 12033 1726867201.58775: exiting _queue_task() for managed_node3/include_tasks 12033 1726867201.58790: done queuing things up, now waiting for results queue to drain 12033 1726867201.58794: waiting for pending results... 
12033 1726867201.58959: running TaskExecutor() for managed_node3/TASK: Setup 12033 1726867201.59025: in run() - task 0affcac9-a3a5-74bb-502b-0000000008ed 12033 1726867201.59041: variable 'ansible_search_path' from source: unknown 12033 1726867201.59044: variable 'ansible_search_path' from source: unknown 12033 1726867201.59075: variable 'lsr_setup' from source: include params 12033 1726867201.59227: variable 'lsr_setup' from source: include params 12033 1726867201.59354: variable 'omit' from source: magic vars 12033 1726867201.59421: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.59429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.59439: variable 'omit' from source: magic vars 12033 1726867201.60182: variable 'ansible_distribution_major_version' from source: facts 12033 1726867201.60638: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867201.60643: variable 'item' from source: unknown 12033 1726867201.60747: variable 'item' from source: unknown 12033 1726867201.60750: variable 'item' from source: unknown 12033 1726867201.60788: variable 'item' from source: unknown 12033 1726867201.61073: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.61076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.61081: variable 'omit' from source: magic vars 12033 1726867201.61659: variable 'ansible_distribution_major_version' from source: facts 12033 1726867201.61663: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867201.61665: variable 'item' from source: unknown 12033 1726867201.61667: variable 'item' from source: unknown 12033 1726867201.61670: variable 'item' from source: unknown 12033 1726867201.61908: variable 'item' from source: unknown 12033 1726867201.61974: dumping result to json 12033 1726867201.62086: done dumping result, returning 12033 
1726867201.62095: done running TaskExecutor() for managed_node3/TASK: Setup [0affcac9-a3a5-74bb-502b-0000000008ed] 12033 1726867201.62098: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ed 12033 1726867201.62141: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ed 12033 1726867201.62144: WORKER PROCESS EXITING 12033 1726867201.62171: no more pending results, returning what we have 12033 1726867201.62180: in VariableManager get_vars() 12033 1726867201.62238: Calling all_inventory to load vars for managed_node3 12033 1726867201.62241: Calling groups_inventory to load vars for managed_node3 12033 1726867201.62243: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867201.62258: Calling all_plugins_play to load vars for managed_node3 12033 1726867201.62262: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867201.62265: Calling groups_plugins_play to load vars for managed_node3 12033 1726867201.64942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867201.66805: done with get_vars() 12033 1726867201.66825: variable 'ansible_search_path' from source: unknown 12033 1726867201.66826: variable 'ansible_search_path' from source: unknown 12033 1726867201.66869: variable 'ansible_search_path' from source: unknown 12033 1726867201.66870: variable 'ansible_search_path' from source: unknown 12033 1726867201.66909: we have included files to process 12033 1726867201.66910: generating all_blocks data 12033 1726867201.66912: done generating all_blocks data 12033 1726867201.66917: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867201.66918: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867201.66921: 
Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12033 1726867201.67929: done processing included file 12033 1726867201.67931: iterating over new_blocks loaded from include file 12033 1726867201.67933: in VariableManager get_vars() 12033 1726867201.67951: done with get_vars() 12033 1726867201.67953: filtering new block on tags 12033 1726867201.68016: done filtering new block on tags 12033 1726867201.68019: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 12033 1726867201.68025: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867201.68026: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867201.68029: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 12033 1726867201.68128: in VariableManager get_vars() 12033 1726867201.68148: done with get_vars() 12033 1726867201.68154: variable 'item' from source: include params 12033 1726867201.68264: variable 'item' from source: include params 12033 1726867201.68308: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12033 1726867201.68389: in VariableManager get_vars() 12033 1726867201.68421: done with get_vars() 12033 1726867201.68558: in VariableManager 
get_vars() 12033 1726867201.68576: done with get_vars() 12033 1726867201.68585: variable 'item' from source: include params 12033 1726867201.68654: variable 'item' from source: include params 12033 1726867201.68683: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12033 1726867201.68823: in VariableManager get_vars() 12033 1726867201.68848: done with get_vars() 12033 1726867201.68935: done processing included file 12033 1726867201.68937: iterating over new_blocks loaded from include file 12033 1726867201.69028: in VariableManager get_vars() 12033 1726867201.69046: done with get_vars() 12033 1726867201.69048: filtering new block on tags 12033 1726867201.69128: done filtering new block on tags 12033 1726867201.69134: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 12033 1726867201.69138: extending task lists for all hosts with included blocks 12033 1726867201.69754: done extending task lists 12033 1726867201.69756: done processing included files 12033 1726867201.69757: results queue empty 12033 1726867201.69757: checking for any_errors_fatal 12033 1726867201.69761: done checking for any_errors_fatal 12033 1726867201.69762: checking for max_fail_percentage 12033 1726867201.69763: done checking for max_fail_percentage 12033 1726867201.69764: checking to see if all hosts have failed and the running result is not ok 12033 1726867201.69765: done checking to see if all hosts have failed 12033 1726867201.69773: getting the remaining hosts for this loop 12033 1726867201.69775: done getting the remaining hosts for this loop 12033 
1726867201.69780: getting the next task for host managed_node3 12033 1726867201.69785: done getting next task for host managed_node3 12033 1726867201.69787: ^ task is: TASK: Install dnsmasq 12033 1726867201.69790: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867201.69793: getting variables 12033 1726867201.69794: in VariableManager get_vars() 12033 1726867201.69805: Calling all_inventory to load vars for managed_node3 12033 1726867201.69807: Calling groups_inventory to load vars for managed_node3 12033 1726867201.69809: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867201.69815: Calling all_plugins_play to load vars for managed_node3 12033 1726867201.69817: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867201.69821: Calling groups_plugins_play to load vars for managed_node3 12033 1726867201.70942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867201.72511: done with get_vars() 12033 1726867201.72540: done getting variables 12033 1726867201.72589: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:20:01 -0400 (0:00:00.140) 0:00:40.841 ****** 12033 1726867201.72622: entering _queue_task() for managed_node3/package 12033 1726867201.72993: worker is 1 (out of 1 available) 12033 1726867201.73009: exiting _queue_task() for managed_node3/package 12033 1726867201.73023: done queuing things up, now waiting for results queue to drain 12033 1726867201.73025: waiting for pending results... 
12033 1726867201.73400: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 12033 1726867201.73436: in run() - task 0affcac9-a3a5-74bb-502b-000000000974 12033 1726867201.73456: variable 'ansible_search_path' from source: unknown 12033 1726867201.73462: variable 'ansible_search_path' from source: unknown 12033 1726867201.73521: calling self._execute() 12033 1726867201.73614: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.73738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.73746: variable 'omit' from source: magic vars 12033 1726867201.74064: variable 'ansible_distribution_major_version' from source: facts 12033 1726867201.74085: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867201.74101: variable 'omit' from source: magic vars 12033 1726867201.74152: variable 'omit' from source: magic vars 12033 1726867201.74366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867201.76825: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867201.76906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867201.76951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867201.77002: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867201.77033: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867201.77140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867201.77174: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867201.77218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867201.77319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867201.77322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867201.77426: variable '__network_is_ostree' from source: set_fact 12033 1726867201.77430: variable 'omit' from source: magic vars 12033 1726867201.77452: variable 'omit' from source: magic vars 12033 1726867201.77486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867201.77521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867201.77583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867201.77586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.77588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867201.77623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867201.77630: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.77636: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867201.77737: Set connection var ansible_pipelining to False 12033 1726867201.77761: Set connection var ansible_shell_executable to /bin/sh 12033 1726867201.77782: Set connection var ansible_timeout to 10 12033 1726867201.77784: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867201.77786: Set connection var ansible_connection to ssh 12033 1726867201.77863: Set connection var ansible_shell_type to sh 12033 1726867201.77868: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.77871: variable 'ansible_connection' from source: unknown 12033 1726867201.77874: variable 'ansible_module_compression' from source: unknown 12033 1726867201.77878: variable 'ansible_shell_type' from source: unknown 12033 1726867201.77885: variable 'ansible_shell_executable' from source: unknown 12033 1726867201.77887: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867201.77889: variable 'ansible_pipelining' from source: unknown 12033 1726867201.77894: variable 'ansible_timeout' from source: unknown 12033 1726867201.77896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867201.78009: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867201.78085: variable 'omit' from source: magic vars 12033 1726867201.78088: starting attempt loop 12033 1726867201.78091: running the handler 12033 1726867201.78099: variable 'ansible_facts' from source: unknown 12033 1726867201.78102: variable 'ansible_facts' from source: unknown 12033 1726867201.78104: _low_level_execute_command(): starting 12033 1726867201.78107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867201.78960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867201.79029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.79068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867201.79099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867201.79137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867201.79237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867201.80910: stdout chunk (state=3): >>>/root <<< 12033 1726867201.81037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867201.81041: stdout chunk (state=3): >>><<< 12033 1726867201.81049: stderr chunk (state=3): >>><<< 12033 1726867201.81067: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867201.81086: _low_level_execute_command(): starting 12033 1726867201.81090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662 `" && echo ansible-tmp-1726867201.8106894-14062-40601547910662="` echo /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662 `" ) && sleep 0' 12033 1726867201.81526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.81530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.81533: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.81535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.81585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867201.81602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867201.81641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867201.83539: stdout chunk (state=3): >>>ansible-tmp-1726867201.8106894-14062-40601547910662=/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662 <<< 12033 1726867201.83646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867201.83671: stderr chunk (state=3): >>><<< 12033 1726867201.83674: stdout chunk (state=3): >>><<< 12033 1726867201.83699: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867201.8106894-14062-40601547910662=/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867201.83727: variable 'ansible_module_compression' from source: unknown 12033 1726867201.83770: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12033 1726867201.83816: variable 'ansible_facts' from source: unknown 12033 1726867201.83878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py 12033 1726867201.83985: Sending initial data 12033 1726867201.83988: Sent initial data (151 bytes) 12033 1726867201.84440: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.84444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867201.84447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.84450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.84452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.84508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867201.84513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867201.84556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867201.86106: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12033 1726867201.86110: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867201.86151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867201.86196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpurs87298 /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py <<< 12033 1726867201.86199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py" <<< 12033 1726867201.86243: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpurs87298" to remote "/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py" <<< 12033 1726867201.86247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py" <<< 12033 1726867201.86953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867201.86989: stderr chunk (state=3): >>><<< 12033 1726867201.86993: stdout chunk (state=3): >>><<< 12033 1726867201.87030: done transferring module to remote 12033 1726867201.87039: _low_level_execute_command(): starting 12033 1726867201.87044: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/ /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py && sleep 0' 12033 1726867201.87626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.87629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867201.87637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867201.87680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867201.87716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867201.89472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867201.89501: stderr chunk (state=3): >>><<< 12033 1726867201.89504: stdout chunk (state=3): >>><<< 12033 1726867201.89517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867201.89520: _low_level_execute_command(): starting 12033 1726867201.89525: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/AnsiballZ_dnf.py && sleep 0' 12033 1726867201.89950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.89954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.89956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867201.89958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867201.90006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867201.90010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 
1726867201.90066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.31263: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12033 1726867202.35710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867202.35735: stderr chunk (state=3): >>><<< 12033 1726867202.35744: stdout chunk (state=3): >>><<< 12033 1726867202.35897: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867202.35902: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867202.35910: _low_level_execute_command(): starting 12033 1726867202.35913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867201.8106894-14062-40601547910662/ > /dev/null 2>&1 && sleep 0' 12033 1726867202.36508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867202.36523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867202.36538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867202.36557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867202.36673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867202.36684: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867202.36710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867202.36727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.36809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.38661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867202.38669: stdout chunk (state=3): >>><<< 12033 1726867202.38768: stderr chunk (state=3): >>><<< 12033 1726867202.38887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867202.38890: handler run complete 12033 1726867202.38950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867202.39145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867202.39189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867202.39224: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867202.39257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867202.39335: variable '__install_status' from source: set_fact 12033 1726867202.39354: Evaluated conditional (__install_status is success): True 12033 1726867202.39370: attempt loop complete, returning result 12033 1726867202.39373: _execute() done 12033 1726867202.39375: dumping result to json 12033 1726867202.39382: done dumping result, returning 12033 1726867202.39390: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0affcac9-a3a5-74bb-502b-000000000974] 12033 1726867202.39396: sending task result for task 0affcac9-a3a5-74bb-502b-000000000974 12033 1726867202.39509: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000974 12033 1726867202.39512: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12033 1726867202.39626: no more pending results, returning what we have 12033 1726867202.39631: results queue empty 12033 1726867202.39632: checking for any_errors_fatal 12033 
1726867202.39633: done checking for any_errors_fatal 12033 1726867202.39634: checking for max_fail_percentage 12033 1726867202.39638: done checking for max_fail_percentage 12033 1726867202.39639: checking to see if all hosts have failed and the running result is not ok 12033 1726867202.39640: done checking to see if all hosts have failed 12033 1726867202.39640: getting the remaining hosts for this loop 12033 1726867202.39642: done getting the remaining hosts for this loop 12033 1726867202.39646: getting the next task for host managed_node3 12033 1726867202.39652: done getting next task for host managed_node3 12033 1726867202.39654: ^ task is: TASK: Install pgrep, sysctl 12033 1726867202.39657: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867202.39660: getting variables 12033 1726867202.39662: in VariableManager get_vars() 12033 1726867202.39700: Calling all_inventory to load vars for managed_node3 12033 1726867202.39703: Calling groups_inventory to load vars for managed_node3 12033 1726867202.39704: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867202.39714: Calling all_plugins_play to load vars for managed_node3 12033 1726867202.39716: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867202.39719: Calling groups_plugins_play to load vars for managed_node3 12033 1726867202.41434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867202.42986: done with get_vars() 12033 1726867202.43007: done getting variables 12033 1726867202.43062: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 17:20:02 -0400 (0:00:00.704) 0:00:41.546 ****** 12033 1726867202.43095: entering _queue_task() for managed_node3/package 12033 1726867202.43396: worker is 1 (out of 1 available) 12033 1726867202.43408: exiting _queue_task() for managed_node3/package 12033 1726867202.43420: done queuing things up, now waiting for results queue to drain 12033 1726867202.43422: waiting for pending results... 
12033 1726867202.43798: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12033 1726867202.43808: in run() - task 0affcac9-a3a5-74bb-502b-000000000975 12033 1726867202.43894: variable 'ansible_search_path' from source: unknown 12033 1726867202.43899: variable 'ansible_search_path' from source: unknown 12033 1726867202.43902: calling self._execute() 12033 1726867202.43950: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867202.43953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867202.43964: variable 'omit' from source: magic vars 12033 1726867202.44329: variable 'ansible_distribution_major_version' from source: facts 12033 1726867202.44339: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867202.44455: variable 'ansible_os_family' from source: facts 12033 1726867202.44466: Evaluated conditional (ansible_os_family == 'RedHat'): True 12033 1726867202.44642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867202.44905: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867202.44945: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867202.44985: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867202.45068: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867202.45167: variable 'ansible_distribution_major_version' from source: facts 12033 1726867202.45191: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 12033 1726867202.45210: when evaluation is False, skipping this task 12033 1726867202.45217: _execute() done 12033 1726867202.45224: dumping result to json 12033 1726867202.45231: done dumping result, 
returning 12033 1726867202.45243: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcac9-a3a5-74bb-502b-000000000975] 12033 1726867202.45254: sending task result for task 0affcac9-a3a5-74bb-502b-000000000975 12033 1726867202.45556: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000975 12033 1726867202.45560: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 12033 1726867202.45612: no more pending results, returning what we have 12033 1726867202.45616: results queue empty 12033 1726867202.45617: checking for any_errors_fatal 12033 1726867202.45626: done checking for any_errors_fatal 12033 1726867202.45627: checking for max_fail_percentage 12033 1726867202.45629: done checking for max_fail_percentage 12033 1726867202.45629: checking to see if all hosts have failed and the running result is not ok 12033 1726867202.45630: done checking to see if all hosts have failed 12033 1726867202.45631: getting the remaining hosts for this loop 12033 1726867202.45633: done getting the remaining hosts for this loop 12033 1726867202.45636: getting the next task for host managed_node3 12033 1726867202.45643: done getting next task for host managed_node3 12033 1726867202.45645: ^ task is: TASK: Install pgrep, sysctl 12033 1726867202.45649: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867202.45653: getting variables 12033 1726867202.45654: in VariableManager get_vars() 12033 1726867202.45690: Calling all_inventory to load vars for managed_node3 12033 1726867202.45693: Calling groups_inventory to load vars for managed_node3 12033 1726867202.45695: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867202.45704: Calling all_plugins_play to load vars for managed_node3 12033 1726867202.45707: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867202.45709: Calling groups_plugins_play to load vars for managed_node3 12033 1726867202.47353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867202.48544: done with get_vars() 12033 1726867202.48559: done getting variables 12033 1726867202.48607: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 17:20:02 -0400 (0:00:00.055) 0:00:41.602 ****** 12033 1726867202.48629: entering _queue_task() for managed_node3/package 12033 1726867202.48846: worker is 1 (out of 1 available) 12033 1726867202.48860: exiting _queue_task() for managed_node3/package 12033 1726867202.48873: done queuing 
things up, now waiting for results queue to drain 12033 1726867202.48875: waiting for pending results... 12033 1726867202.49053: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 12033 1726867202.49130: in run() - task 0affcac9-a3a5-74bb-502b-000000000976 12033 1726867202.49143: variable 'ansible_search_path' from source: unknown 12033 1726867202.49147: variable 'ansible_search_path' from source: unknown 12033 1726867202.49173: calling self._execute() 12033 1726867202.49249: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867202.49253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867202.49262: variable 'omit' from source: magic vars 12033 1726867202.49532: variable 'ansible_distribution_major_version' from source: facts 12033 1726867202.49546: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867202.49640: variable 'ansible_os_family' from source: facts 12033 1726867202.49649: Evaluated conditional (ansible_os_family == 'RedHat'): True 12033 1726867202.49914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867202.50094: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867202.50128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867202.50159: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867202.50204: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867202.50260: variable 'ansible_distribution_major_version' from source: facts 12033 1726867202.50271: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12033 1726867202.50308: variable 'omit' from source: magic vars 12033 1726867202.50318: variable 
'omit' from source: magic vars 12033 1726867202.50454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867202.52068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867202.52118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867202.52145: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867202.52173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867202.52195: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867202.52259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867202.52285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867202.52304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867202.52329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867202.52339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867202.52406: variable 
'__network_is_ostree' from source: set_fact 12033 1726867202.52409: variable 'omit' from source: magic vars 12033 1726867202.52430: variable 'omit' from source: magic vars 12033 1726867202.52452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867202.52472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867202.52496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867202.52504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867202.52514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867202.52536: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867202.52538: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867202.52541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867202.52611: Set connection var ansible_pipelining to False 12033 1726867202.52618: Set connection var ansible_shell_executable to /bin/sh 12033 1726867202.52625: Set connection var ansible_timeout to 10 12033 1726867202.52630: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867202.52633: Set connection var ansible_connection to ssh 12033 1726867202.52637: Set connection var ansible_shell_type to sh 12033 1726867202.52654: variable 'ansible_shell_executable' from source: unknown 12033 1726867202.52657: variable 'ansible_connection' from source: unknown 12033 1726867202.52659: variable 'ansible_module_compression' from source: unknown 12033 1726867202.52661: variable 'ansible_shell_type' from source: unknown 12033 1726867202.52664: variable 'ansible_shell_executable' from source: unknown 12033 
1726867202.52666: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867202.52671: variable 'ansible_pipelining' from source: unknown 12033 1726867202.52673: variable 'ansible_timeout' from source: unknown 12033 1726867202.52679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867202.52748: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867202.52756: variable 'omit' from source: magic vars 12033 1726867202.52762: starting attempt loop 12033 1726867202.52765: running the handler 12033 1726867202.52771: variable 'ansible_facts' from source: unknown 12033 1726867202.52773: variable 'ansible_facts' from source: unknown 12033 1726867202.52813: _low_level_execute_command(): starting 12033 1726867202.52816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867202.53528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.53611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867202.53626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.53687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.55357: stdout chunk (state=3): >>>/root <<< 12033 1726867202.55456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867202.55485: stderr chunk (state=3): >>><<< 12033 1726867202.55488: stdout chunk (state=3): >>><<< 12033 1726867202.55506: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 
1726867202.55515: _low_level_execute_command(): starting 12033 1726867202.55521: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779 `" && echo ansible-tmp-1726867202.5550508-14095-240761084580779="` echo /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779 `" ) && sleep 0' 12033 1726867202.55950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.56020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867202.56039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.56105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.58012: stdout chunk (state=3): >>>ansible-tmp-1726867202.5550508-14095-240761084580779=/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779 <<< 12033 1726867202.58124: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867202.58144: stderr chunk (state=3): >>><<< 12033 1726867202.58147: stdout chunk (state=3): >>><<< 12033 1726867202.58159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867202.5550508-14095-240761084580779=/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867202.58185: variable 'ansible_module_compression' from source: unknown 12033 1726867202.58235: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12033 1726867202.58269: variable 'ansible_facts' from source: unknown 12033 1726867202.58353: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py 12033 
1726867202.58451: Sending initial data 12033 1726867202.58455: Sent initial data (152 bytes) 12033 1726867202.58853: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867202.58857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.58859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867202.58861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867202.58863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.58913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867202.58918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.58962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.60519: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867202.60524: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867202.60557: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867202.60604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmppesbaiyg /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py <<< 12033 1726867202.60610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py" <<< 12033 1726867202.60645: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmppesbaiyg" to remote "/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py" <<< 12033 1726867202.61314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867202.61340: stderr chunk (state=3): >>><<< 12033 1726867202.61343: stdout chunk (state=3): >>><<< 12033 1726867202.61376: done transferring module to remote 12033 1726867202.61388: _low_level_execute_command(): starting 12033 1726867202.61394: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/ 
/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py && sleep 0' 12033 1726867202.61774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867202.61780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.61792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.61845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867202.61848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.61897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867202.63663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867202.63686: stderr chunk (state=3): >>><<< 12033 1726867202.63690: stdout chunk (state=3): >>><<< 12033 1726867202.63705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867202.63708: _low_level_execute_command(): starting 12033 1726867202.63710: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/AnsiballZ_dnf.py && sleep 0' 12033 1726867202.64119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867202.64122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867202.64124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.64127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867202.64129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867202.64176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867202.64183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867202.64233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.05023: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12033 1726867203.09152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867203.09185: stdout chunk (state=3): >>><<< 12033 1726867203.09189: stderr chunk (state=3): >>><<< 12033 1726867203.09330: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867203.09340: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867203.09343: _low_level_execute_command(): starting 12033 1726867203.09345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867202.5550508-14095-240761084580779/ > /dev/null 2>&1 && sleep 0' 12033 1726867203.09874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867203.09910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867203.09993: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.10027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867203.10045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867203.10068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867203.10152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.11997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867203.12014: stdout chunk (state=3): >>><<< 12033 1726867203.12026: stderr chunk (state=3): >>><<< 12033 1726867203.12047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867203.12059: handler run complete 12033 1726867203.12182: attempt loop complete, returning result 12033 1726867203.12185: _execute() done 12033 1726867203.12187: dumping result to json 12033 1726867203.12189: done dumping result, returning 12033 1726867203.12191: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcac9-a3a5-74bb-502b-000000000976] 12033 1726867203.12193: sending task result for task 0affcac9-a3a5-74bb-502b-000000000976 12033 1726867203.12265: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000976 12033 1726867203.12268: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12033 1726867203.12353: no more pending results, returning what we have 12033 1726867203.12357: results queue empty 12033 1726867203.12358: checking for any_errors_fatal 12033 1726867203.12363: done checking for any_errors_fatal 12033 1726867203.12364: checking for max_fail_percentage 12033 1726867203.12366: done checking for max_fail_percentage 12033 1726867203.12367: checking to see if all hosts have failed and the running result is not ok 12033 1726867203.12368: done checking to see if all hosts have failed 12033 1726867203.12369: getting the remaining hosts for this loop 12033 1726867203.12371: done getting the remaining hosts for this loop 12033 1726867203.12376: getting the next task for host managed_node3 12033 1726867203.12385: done getting next task for host managed_node3 12033 1726867203.12387: ^ task is: TASK: Create test interfaces 12033 1726867203.12391: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867203.12396: getting variables 12033 1726867203.12397: in VariableManager get_vars() 12033 1726867203.12438: Calling all_inventory to load vars for managed_node3 12033 1726867203.12441: Calling groups_inventory to load vars for managed_node3 12033 1726867203.12443: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867203.12454: Calling all_plugins_play to load vars for managed_node3 12033 1726867203.12457: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867203.12460: Calling groups_plugins_play to load vars for managed_node3 12033 1726867203.14402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867203.16020: done with get_vars() 12033 1726867203.16043: done getting variables 12033 1726867203.16117: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 17:20:03 -0400 (0:00:00.675) 0:00:42.277 ****** 12033 1726867203.16150: entering _queue_task() for managed_node3/shell 12033 1726867203.16532: worker is 1 (out of 1 available) 12033 1726867203.16589: exiting _queue_task() for managed_node3/shell 12033 1726867203.16600: done queuing things up, now waiting for results queue to drain 12033 1726867203.16602: waiting for pending results... 12033 1726867203.17068: running TaskExecutor() for managed_node3/TASK: Create test interfaces 12033 1726867203.17073: in run() - task 0affcac9-a3a5-74bb-502b-000000000977 12033 1726867203.17079: variable 'ansible_search_path' from source: unknown 12033 1726867203.17082: variable 'ansible_search_path' from source: unknown 12033 1726867203.17086: calling self._execute() 12033 1726867203.17153: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867203.17176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867203.17275: variable 'omit' from source: magic vars 12033 1726867203.17594: variable 'ansible_distribution_major_version' from source: facts 12033 1726867203.17618: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867203.17629: variable 'omit' from source: magic vars 12033 1726867203.17681: variable 'omit' from source: magic vars 12033 1726867203.18138: variable 'dhcp_interface1' from source: play vars 12033 1726867203.18146: variable 'dhcp_interface2' from source: play vars 12033 1726867203.18149: variable 'omit' from source: magic vars 12033 1726867203.18175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867203.18247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867203.18255: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867203.18278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867203.18297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867203.18355: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867203.18358: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867203.18364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867203.18463: Set connection var ansible_pipelining to False 12033 1726867203.18491: Set connection var ansible_shell_executable to /bin/sh 12033 1726867203.18584: Set connection var ansible_timeout to 10 12033 1726867203.18588: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867203.18591: Set connection var ansible_connection to ssh 12033 1726867203.18593: Set connection var ansible_shell_type to sh 12033 1726867203.18595: variable 'ansible_shell_executable' from source: unknown 12033 1726867203.18597: variable 'ansible_connection' from source: unknown 12033 1726867203.18599: variable 'ansible_module_compression' from source: unknown 12033 1726867203.18601: variable 'ansible_shell_type' from source: unknown 12033 1726867203.18603: variable 'ansible_shell_executable' from source: unknown 12033 1726867203.18605: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867203.18607: variable 'ansible_pipelining' from source: unknown 12033 1726867203.18609: variable 'ansible_timeout' from source: unknown 12033 1726867203.18611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867203.18784: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867203.18787: variable 'omit' from source: magic vars 12033 1726867203.18790: starting attempt loop 12033 1726867203.18792: running the handler 12033 1726867203.18836: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867203.18843: _low_level_execute_command(): starting 12033 1726867203.18845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867203.19606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867203.19694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.19748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867203.19764: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867203.19788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867203.19967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.21635: stdout chunk (state=3): >>>/root <<< 12033 1726867203.21784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867203.21808: stdout chunk (state=3): >>><<< 12033 1726867203.21811: stderr chunk (state=3): >>><<< 12033 1726867203.21830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867203.21938: _low_level_execute_command(): starting 12033 1726867203.21941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir 
"` echo /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118 `" && echo ansible-tmp-1726867203.2183979-14135-57920404384118="` echo /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118 `" ) && sleep 0' 12033 1726867203.23121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867203.23132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.23136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867203.23138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.23250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.25207: stdout chunk (state=3): >>>ansible-tmp-1726867203.2183979-14135-57920404384118=/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118 <<< 12033 1726867203.25221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867203.25405: stderr chunk (state=3): >>><<< 12033 1726867203.25408: stdout 
chunk (state=3): >>><<< 12033 1726867203.25588: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867203.2183979-14135-57920404384118=/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867203.25592: variable 'ansible_module_compression' from source: unknown 12033 1726867203.25594: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867203.25596: variable 'ansible_facts' from source: unknown 12033 1726867203.26043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py 12033 1726867203.26436: Sending initial data 12033 1726867203.26447: Sent initial data (155 bytes) 12033 1726867203.27559: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867203.27578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867203.27813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867203.27816: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867203.27819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867203.27822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.27948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867203.28014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.29583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867203.29617: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867203.29658: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmparyr4x_h /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py <<< 12033 1726867203.29662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py" <<< 12033 1726867203.29717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmparyr4x_h" to remote "/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py" <<< 12033 1726867203.31087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867203.31113: stderr chunk (state=3): >>><<< 12033 1726867203.31116: stdout chunk (state=3): >>><<< 12033 1726867203.31272: done transferring module to remote 12033 1726867203.31285: _low_level_execute_command(): starting 12033 1726867203.31289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/ /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py && sleep 0' 12033 1726867203.31921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867203.31930: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12033 1726867203.32004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.32041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867203.32059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867203.32082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867203.32142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867203.33960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867203.33963: stdout chunk (state=3): >>><<< 12033 1726867203.33983: stderr chunk (state=3): >>><<< 12033 1726867203.34230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867203.34234: _low_level_execute_command(): starting 12033 1726867203.34236: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/AnsiballZ_command.py && sleep 0' 12033 1726867203.35383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867203.35387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867203.35404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867203.35542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867203.35573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867203.35774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867204.72368: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n 
nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:20:03.506939", "end": "2024-09-20 17:20:04.720677", "delta": "0:00:01.213738", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867204.74129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867204.74133: stdout chunk (state=3): >>><<< 12033 1726867204.74135: stderr chunk (state=3): >>><<< 12033 1726867204.74236: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 702 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:20:03.506939", "end": "2024-09-20 17:20:04.720677", "delta": "0:00:01.213738", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
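The task result above embeds a setup script whose wait-for-address step is a bounded retry loop (the workaround for the NetworkManager bug referenced in the script). That pattern can be sketched generically as follows; this is a minimal sketch, where `check_ready` is an illustrative stand-in for the script's real test (`ip addr show testbr | grep -q 'inet [1-9]'`) and succeeds on the third call, and the one-second sleep between attempts is elided so the sketch runs instantly:

```shell
# Bounded retry loop: keep checking a condition, give up after 30 tries.
# check_ready is a stand-in for the real readiness test; here it becomes
# true on the 3rd call so the control flow can be exercised directly.
tries_needed=3
attempt=0
check_ready() {
  attempt=$((attempt + 1))
  [ "$attempt" -ge "$tries_needed" ]
}

timer=0
until check_ready; do
  timer=$((timer + 1))
  if [ "$timer" -eq 30 ]; then
    echo "ERROR - condition never became true" >&2
    exit 1
  fi
  # the real script does `sleep 1` here between attempts
done
echo "ready after $attempt checks"
```

The real script additionally re-runs the `ip addr add` commands inside the loop and `continue`s on failure, so a transient error simply consumes one of the 30 attempts instead of aborting the task.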
12033 1726867204.74560: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867204.74563: _low_level_execute_command(): starting 12033 1726867204.74566: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867203.2183979-14135-57920404384118/ > /dev/null 2>&1 && sleep 0' 12033 1726867204.75503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867204.75597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867204.75784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867204.76001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867204.76104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867204.77960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867204.77971: stdout chunk (state=3): >>><<< 12033 1726867204.77986: stderr chunk (state=3): >>><<< 12033 1726867204.78008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867204.78021: handler run complete 12033 1726867204.78050: Evaluated conditional (False): False 12033 1726867204.78066: attempt loop complete, returning result 12033 1726867204.78075: _execute() done 12033 1726867204.78083: dumping result to json 12033 1726867204.78094: done dumping result, returning 12033 1726867204.78107: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0affcac9-a3a5-74bb-502b-000000000977] 12033 1726867204.78117: sending task result for task 0affcac9-a3a5-74bb-502b-000000000977 ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.213738", "end": "2024-09-20 17:20:04.720677", "rc": 0, "start": "2024-09-20 17:20:03.506939" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 702 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12033 1726867204.78316: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000977 12033 1726867204.78336: no more pending results, returning what we have 12033 1726867204.78343: results queue empty 12033 1726867204.78344: checking for any_errors_fatal 12033 1726867204.78357: done checking for any_errors_fatal 12033 1726867204.78357: checking for max_fail_percentage 12033 1726867204.78359: done 
checking for max_fail_percentage 12033 1726867204.78360: checking to see if all hosts have failed and the running result is not ok 12033 1726867204.78361: done checking to see if all hosts have failed 12033 1726867204.78362: getting the remaining hosts for this loop 12033 1726867204.78364: done getting the remaining hosts for this loop 12033 1726867204.78367: getting the next task for host managed_node3 12033 1726867204.78379: done getting next task for host managed_node3 12033 1726867204.78381: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12033 1726867204.78387: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867204.78390: getting variables 12033 1726867204.78394: in VariableManager get_vars() 12033 1726867204.78555: Calling all_inventory to load vars for managed_node3 12033 1726867204.78558: Calling groups_inventory to load vars for managed_node3 12033 1726867204.78561: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867204.78571: Calling all_plugins_play to load vars for managed_node3 12033 1726867204.78574: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867204.78608: Calling groups_plugins_play to load vars for managed_node3 12033 1726867204.78621: WORKER PROCESS EXITING 12033 1726867204.79883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867204.85700: done with get_vars() 12033 1726867204.85723: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:20:04 -0400 (0:00:01.696) 0:00:43.973 ****** 12033 1726867204.85804: entering _queue_task() for managed_node3/include_tasks 12033 1726867204.86138: worker is 1 (out of 1 available) 12033 1726867204.86150: exiting _queue_task() for managed_node3/include_tasks 12033 1726867204.86162: done queuing things up, now waiting for results queue to drain 12033 1726867204.86164: waiting for pending results... 
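The non-RHEL6 branch of the setup script shown in the earlier task result enables the `dhcp`, `dhcpv6`, and `dhcpv6-client` firewalld services only when `firewall-cmd --query-service` reports them absent. That idempotent-add control flow can be sketched without a live firewalld by stubbing the two `firewall-cmd` calls; the stub functions and the pre-enabled `dhcp` service below are illustrative, not part of the original script:

```shell
# Idempotent "add only if missing" loop, with firewall-cmd stubbed out.
enabled_services=""
query_service() {  # stand-in for: firewall-cmd --query-service="$1"
  case " $enabled_services " in *" $1 "*) return 0 ;; *) return 1 ;; esac
}
add_service() {    # stand-in for: firewall-cmd --add-service "$1"
  enabled_services="$enabled_services $1"
}

add_service dhcp   # pretend one service is already enabled

for service in dhcp dhcpv6 dhcpv6-client; do
  if ! query_service "$service"; then
    add_service "$service"
  fi
done
echo "now enabled:$enabled_services"   # prints: now enabled: dhcp dhcpv6 dhcpv6-client
```

In this run the guard never fired at all: the log shows `systemctl is-active firewalld` returned `inactive`, so the script skipped the firewalld loop and went straight to starting dnsmasq.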
12033 1726867204.86444: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12033 1726867204.86583: in run() - task 0affcac9-a3a5-74bb-502b-00000000097e 12033 1726867204.86601: variable 'ansible_search_path' from source: unknown 12033 1726867204.86607: variable 'ansible_search_path' from source: unknown 12033 1726867204.86641: calling self._execute() 12033 1726867204.86884: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867204.86887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867204.86890: variable 'omit' from source: magic vars 12033 1726867204.87118: variable 'ansible_distribution_major_version' from source: facts 12033 1726867204.87130: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867204.87136: _execute() done 12033 1726867204.87140: dumping result to json 12033 1726867204.87143: done dumping result, returning 12033 1726867204.87148: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-74bb-502b-00000000097e] 12033 1726867204.87154: sending task result for task 0affcac9-a3a5-74bb-502b-00000000097e 12033 1726867204.87382: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000097e 12033 1726867204.87385: WORKER PROCESS EXITING 12033 1726867204.87406: no more pending results, returning what we have 12033 1726867204.87410: in VariableManager get_vars() 12033 1726867204.87443: Calling all_inventory to load vars for managed_node3 12033 1726867204.87446: Calling groups_inventory to load vars for managed_node3 12033 1726867204.87448: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867204.87457: Calling all_plugins_play to load vars for managed_node3 12033 1726867204.87459: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867204.87461: Calling groups_plugins_play to load vars for managed_node3 12033 
1726867204.88666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867204.90185: done with get_vars() 12033 1726867204.90203: variable 'ansible_search_path' from source: unknown 12033 1726867204.90205: variable 'ansible_search_path' from source: unknown 12033 1726867204.90239: we have included files to process 12033 1726867204.90241: generating all_blocks data 12033 1726867204.90243: done generating all_blocks data 12033 1726867204.90248: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867204.90249: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867204.90252: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867204.90428: done processing included file 12033 1726867204.90430: iterating over new_blocks loaded from include file 12033 1726867204.90432: in VariableManager get_vars() 12033 1726867204.90452: done with get_vars() 12033 1726867204.90454: filtering new block on tags 12033 1726867204.90487: done filtering new block on tags 12033 1726867204.90489: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12033 1726867204.90494: extending task lists for all hosts with included blocks 12033 1726867204.90773: done extending task lists 12033 1726867204.90774: done processing included files 12033 1726867204.90775: results queue empty 12033 1726867204.90776: checking for any_errors_fatal 12033 1726867204.90783: done checking for any_errors_fatal 12033 1726867204.90784: checking for max_fail_percentage 12033 1726867204.90785: done checking for 
max_fail_percentage 12033 1726867204.90786: checking to see if all hosts have failed and the running result is not ok 12033 1726867204.90787: done checking to see if all hosts have failed 12033 1726867204.90788: getting the remaining hosts for this loop 12033 1726867204.90789: done getting the remaining hosts for this loop 12033 1726867204.90791: getting the next task for host managed_node3 12033 1726867204.90796: done getting next task for host managed_node3 12033 1726867204.90798: ^ task is: TASK: Get stat for interface {{ interface }} 12033 1726867204.90802: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867204.90804: getting variables 12033 1726867204.90805: in VariableManager get_vars() 12033 1726867204.90816: Calling all_inventory to load vars for managed_node3 12033 1726867204.90818: Calling groups_inventory to load vars for managed_node3 12033 1726867204.90820: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867204.90825: Calling all_plugins_play to load vars for managed_node3 12033 1726867204.90827: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867204.90830: Calling groups_plugins_play to load vars for managed_node3 12033 1726867204.92117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867204.93693: done with get_vars() 12033 1726867204.93712: done getting variables 12033 1726867204.93862: variable 'interface' from source: task vars 12033 1726867204.93866: variable 'dhcp_interface1' from source: play vars 12033 1726867204.93926: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:20:04 -0400 (0:00:00.081) 0:00:44.055 ****** 12033 1726867204.93958: entering _queue_task() for managed_node3/stat 12033 1726867204.94257: worker is 1 (out of 1 available) 12033 1726867204.94270: exiting _queue_task() for managed_node3/stat 12033 1726867204.94285: done queuing things up, now waiting for results queue to drain 12033 1726867204.94287: waiting for pending results... 
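The stat task that follows opens with low-level shell probes over the multiplexed SSH connection; the first one logged is literally `/bin/sh -c 'echo ~ && sleep 0'`, which discovers the remote home directory for temp-dir placement. A minimal local stand-in for that probe (run locally here rather than over SSH, purely for illustration):

```python
import subprocess

# Same command the executor sends as its first _low_level_execute_command();
# the trailing "sleep 0" flushes output before the shell exits.
proc = subprocess.run(["/bin/sh", "-c", "echo ~ && sleep 0"],
                      capture_output=True, text=True)

# rc=0 and the expanded home path on stdout correspond to the
# "_low_level_execute_command() done: rc=0, stdout=/root" line in the log.
home = proc.stdout.strip()
```

In the real run the same pattern is then reused to create the `ansible-tmp-*` directory under `~/.ansible/tmp`, chmod the transferred `AnsiballZ_stat.py`, execute it, and finally remove the directory.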
12033 1726867204.94547: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 12033 1726867204.94681: in run() - task 0affcac9-a3a5-74bb-502b-0000000009dd 12033 1726867204.94693: variable 'ansible_search_path' from source: unknown 12033 1726867204.94702: variable 'ansible_search_path' from source: unknown 12033 1726867204.94736: calling self._execute() 12033 1726867204.94830: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867204.94835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867204.94845: variable 'omit' from source: magic vars 12033 1726867204.95247: variable 'ansible_distribution_major_version' from source: facts 12033 1726867204.95250: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867204.95253: variable 'omit' from source: magic vars 12033 1726867204.95291: variable 'omit' from source: magic vars 12033 1726867204.95386: variable 'interface' from source: task vars 12033 1726867204.95390: variable 'dhcp_interface1' from source: play vars 12033 1726867204.95453: variable 'dhcp_interface1' from source: play vars 12033 1726867204.95475: variable 'omit' from source: magic vars 12033 1726867204.95574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867204.95580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867204.95583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867204.95594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867204.95610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867204.95642: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 12033 1726867204.95645: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867204.95648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867204.95751: Set connection var ansible_pipelining to False 12033 1726867204.95758: Set connection var ansible_shell_executable to /bin/sh 12033 1726867204.95792: Set connection var ansible_timeout to 10 12033 1726867204.95795: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867204.95798: Set connection var ansible_connection to ssh 12033 1726867204.95800: Set connection var ansible_shell_type to sh 12033 1726867204.95802: variable 'ansible_shell_executable' from source: unknown 12033 1726867204.95807: variable 'ansible_connection' from source: unknown 12033 1726867204.95883: variable 'ansible_module_compression' from source: unknown 12033 1726867204.95887: variable 'ansible_shell_type' from source: unknown 12033 1726867204.95889: variable 'ansible_shell_executable' from source: unknown 12033 1726867204.95892: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867204.95895: variable 'ansible_pipelining' from source: unknown 12033 1726867204.95898: variable 'ansible_timeout' from source: unknown 12033 1726867204.95901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867204.96048: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867204.96067: variable 'omit' from source: magic vars 12033 1726867204.96080: starting attempt loop 12033 1726867204.96088: running the handler 12033 1726867204.96107: _low_level_execute_command(): starting 12033 1726867204.96232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 
1726867204.96991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867204.97307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867204.97311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867204.97315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867204.97318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867204.97365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867204.97381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867204.97403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867204.97479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867204.99166: stdout chunk (state=3): >>>/root <<< 12033 1726867204.99295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867204.99319: stderr chunk (state=3): >>><<< 12033 1726867204.99330: stdout chunk (state=3): >>><<< 12033 1726867204.99357: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867204.99380: _low_level_execute_command(): starting 12033 1726867204.99391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422 `" && echo ansible-tmp-1726867204.993641-14220-191980161908422="` echo /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422 `" ) && sleep 0' 12033 1726867204.99971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867204.99988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.00005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.00094: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.00140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.00158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.00187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.00259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.02190: stdout chunk (state=3): >>>ansible-tmp-1726867204.993641-14220-191980161908422=/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422 <<< 12033 1726867205.02332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.02342: stdout chunk (state=3): >>><<< 12033 1726867205.02376: stderr chunk (state=3): >>><<< 12033 1726867205.02583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867204.993641-14220-191980161908422=/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.02587: variable 'ansible_module_compression' from source: unknown 12033 1726867205.02590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867205.02592: variable 'ansible_facts' from source: unknown 12033 1726867205.02650: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py 12033 1726867205.02802: Sending initial data 12033 1726867205.02805: Sent initial data (152 bytes) 12033 1726867205.03474: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.03544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.03587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.05132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867205.05190: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867205.05268: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmph7ls6_z6 /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py <<< 12033 1726867205.05272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py" <<< 12033 1726867205.05313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmph7ls6_z6" to remote "/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py" <<< 12033 1726867205.06036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.06094: stderr chunk (state=3): >>><<< 12033 1726867205.06105: stdout chunk (state=3): >>><<< 12033 1726867205.06140: done transferring module to remote 12033 1726867205.06155: _low_level_execute_command(): starting 12033 1726867205.06164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/ /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py && sleep 0' 12033 1726867205.06830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867205.06851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.06867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.06891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867205.06908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867205.06923: 
stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867205.06943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.07039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.07080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.07104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.07190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.08964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.08971: stdout chunk (state=3): >>><<< 12033 1726867205.08981: stderr chunk (state=3): >>><<< 12033 1726867205.09001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.09004: _low_level_execute_command(): starting 12033 1726867205.09007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/AnsiballZ_stat.py && sleep 0' 12033 1726867205.09406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.09409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867205.09412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867205.09414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867205.09459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.09464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.09511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.24911: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28768, "dev": 23, "nlink": 1, "atime": 1726867203.5135262, "mtime": 1726867203.5135262, "ctime": 1726867203.5135262, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867205.26215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867205.26237: stderr chunk (state=3): >>><<< 12033 1726867205.26240: stdout chunk (state=3): >>><<< 12033 1726867205.26256: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28768, "dev": 23, "nlink": 1, "atime": 1726867203.5135262, "mtime": 1726867203.5135262, "ctime": 1726867203.5135262, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867205.26298: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867205.26310: _low_level_execute_command(): starting 12033 1726867205.26319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867204.993641-14220-191980161908422/ > /dev/null 2>&1 && sleep 0' 12033 1726867205.26756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.26766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.26769: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.26771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.26816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.26820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.26869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.28738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.28749: stderr chunk (state=3): >>><<< 12033 1726867205.28757: stdout chunk (state=3): >>><<< 12033 1726867205.28782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.28813: handler run complete 12033 1726867205.28852: attempt loop complete, returning result 12033 1726867205.28860: _execute() done 12033 1726867205.28866: dumping result to json 12033 1726867205.28922: done dumping result, returning 12033 1726867205.28925: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0affcac9-a3a5-74bb-502b-0000000009dd] 12033 1726867205.28927: sending task result for task 0affcac9-a3a5-74bb-502b-0000000009dd ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867203.5135262, "block_size": 4096, "blocks": 0, "ctime": 1726867203.5135262, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28768, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726867203.5135262, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12033 1726867205.29245: no more pending results, returning what we have 12033 1726867205.29249: results queue empty 12033 1726867205.29250: checking for any_errors_fatal 12033 1726867205.29251: done checking for any_errors_fatal 12033 1726867205.29252: checking for max_fail_percentage 12033 1726867205.29254: done checking for 
max_fail_percentage 12033 1726867205.29255: checking to see if all hosts have failed and the running result is not ok 12033 1726867205.29256: done checking to see if all hosts have failed 12033 1726867205.29257: getting the remaining hosts for this loop 12033 1726867205.29259: done getting the remaining hosts for this loop 12033 1726867205.29262: getting the next task for host managed_node3 12033 1726867205.29272: done getting next task for host managed_node3 12033 1726867205.29274: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12033 1726867205.29280: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867205.29285: getting variables 12033 1726867205.29286: in VariableManager get_vars() 12033 1726867205.29326: Calling all_inventory to load vars for managed_node3 12033 1726867205.29329: Calling groups_inventory to load vars for managed_node3 12033 1726867205.29331: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.29472: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.29476: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.29481: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.30084: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000009dd 12033 1726867205.30088: WORKER PROCESS EXITING 12033 1726867205.30964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867205.32586: done with get_vars() 12033 1726867205.32617: done getting variables 12033 1726867205.32659: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867205.32753: variable 'interface' from source: task vars 12033 1726867205.32757: variable 'dhcp_interface1' from source: play vars 12033 1726867205.32800: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:20:05 -0400 (0:00:00.388) 0:00:44.444 ****** 12033 1726867205.32831: entering _queue_task() for managed_node3/assert 12033 1726867205.33064: worker is 1 (out of 1 available) 12033 1726867205.33080: exiting _queue_task() for managed_node3/assert 12033 
1726867205.33093: done queuing things up, now waiting for results queue to drain 12033 1726867205.33095: waiting for pending results... 12033 1726867205.33281: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 12033 1726867205.33365: in run() - task 0affcac9-a3a5-74bb-502b-00000000097f 12033 1726867205.33379: variable 'ansible_search_path' from source: unknown 12033 1726867205.33382: variable 'ansible_search_path' from source: unknown 12033 1726867205.33414: calling self._execute() 12033 1726867205.33490: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.33494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.33505: variable 'omit' from source: magic vars 12033 1726867205.33788: variable 'ansible_distribution_major_version' from source: facts 12033 1726867205.33798: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867205.33805: variable 'omit' from source: magic vars 12033 1726867205.33845: variable 'omit' from source: magic vars 12033 1726867205.33916: variable 'interface' from source: task vars 12033 1726867205.33920: variable 'dhcp_interface1' from source: play vars 12033 1726867205.33965: variable 'dhcp_interface1' from source: play vars 12033 1726867205.33983: variable 'omit' from source: magic vars 12033 1726867205.34014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867205.34040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867205.34056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867205.34069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867205.34081: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867205.34108: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867205.34112: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.34114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.34180: Set connection var ansible_pipelining to False 12033 1726867205.34187: Set connection var ansible_shell_executable to /bin/sh 12033 1726867205.34203: Set connection var ansible_timeout to 10 12033 1726867205.34206: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867205.34208: Set connection var ansible_connection to ssh 12033 1726867205.34210: Set connection var ansible_shell_type to sh 12033 1726867205.34224: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.34227: variable 'ansible_connection' from source: unknown 12033 1726867205.34229: variable 'ansible_module_compression' from source: unknown 12033 1726867205.34232: variable 'ansible_shell_type' from source: unknown 12033 1726867205.34234: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.34236: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.34240: variable 'ansible_pipelining' from source: unknown 12033 1726867205.34243: variable 'ansible_timeout' from source: unknown 12033 1726867205.34247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.34366: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867205.34399: variable 'omit' from source: magic vars 12033 1726867205.34402: starting 
attempt loop 12033 1726867205.34405: running the handler 12033 1726867205.34582: variable 'interface_stat' from source: set_fact 12033 1726867205.34585: Evaluated conditional (interface_stat.stat.exists): True 12033 1726867205.34588: handler run complete 12033 1726867205.34590: attempt loop complete, returning result 12033 1726867205.34591: _execute() done 12033 1726867205.34593: dumping result to json 12033 1726867205.34595: done dumping result, returning 12033 1726867205.34597: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0affcac9-a3a5-74bb-502b-00000000097f] 12033 1726867205.34599: sending task result for task 0affcac9-a3a5-74bb-502b-00000000097f 12033 1726867205.34676: done sending task result for task 0affcac9-a3a5-74bb-502b-00000000097f 12033 1726867205.34682: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867205.34920: no more pending results, returning what we have 12033 1726867205.34924: results queue empty 12033 1726867205.34925: checking for any_errors_fatal 12033 1726867205.34932: done checking for any_errors_fatal 12033 1726867205.34933: checking for max_fail_percentage 12033 1726867205.34935: done checking for max_fail_percentage 12033 1726867205.34936: checking to see if all hosts have failed and the running result is not ok 12033 1726867205.34937: done checking to see if all hosts have failed 12033 1726867205.34938: getting the remaining hosts for this loop 12033 1726867205.34940: done getting the remaining hosts for this loop 12033 1726867205.34943: getting the next task for host managed_node3 12033 1726867205.34952: done getting next task for host managed_node3 12033 1726867205.34955: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12033 1726867205.34959: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867205.34963: getting variables 12033 1726867205.34965: in VariableManager get_vars() 12033 1726867205.35003: Calling all_inventory to load vars for managed_node3 12033 1726867205.35006: Calling groups_inventory to load vars for managed_node3 12033 1726867205.35009: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.35020: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.35023: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.35026: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.36584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867205.37439: done with get_vars() 12033 1726867205.37453: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 
Friday 20 September 2024 17:20:05 -0400 (0:00:00.046) 0:00:44.491 ****** 12033 1726867205.37527: entering _queue_task() for managed_node3/include_tasks 12033 1726867205.37736: worker is 1 (out of 1 available) 12033 1726867205.37750: exiting _queue_task() for managed_node3/include_tasks 12033 1726867205.37762: done queuing things up, now waiting for results queue to drain 12033 1726867205.37764: waiting for pending results... 12033 1726867205.38118: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 12033 1726867205.38172: in run() - task 0affcac9-a3a5-74bb-502b-000000000983 12033 1726867205.38195: variable 'ansible_search_path' from source: unknown 12033 1726867205.38203: variable 'ansible_search_path' from source: unknown 12033 1726867205.38245: calling self._execute() 12033 1726867205.38344: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.38356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.38370: variable 'omit' from source: magic vars 12033 1726867205.39084: variable 'ansible_distribution_major_version' from source: facts 12033 1726867205.39102: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867205.39283: _execute() done 12033 1726867205.39287: dumping result to json 12033 1726867205.39290: done dumping result, returning 12033 1726867205.39293: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-74bb-502b-000000000983] 12033 1726867205.39296: sending task result for task 0affcac9-a3a5-74bb-502b-000000000983 12033 1726867205.39365: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000983 12033 1726867205.39368: WORKER PROCESS EXITING 12033 1726867205.39420: no more pending results, returning what we have 12033 1726867205.39425: in VariableManager get_vars() 12033 1726867205.39471: Calling all_inventory to load vars for managed_node3 12033 
1726867205.39474: Calling groups_inventory to load vars for managed_node3 12033 1726867205.39476: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.39680: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.39684: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.39688: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.42130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867205.45495: done with get_vars() 12033 1726867205.45518: variable 'ansible_search_path' from source: unknown 12033 1726867205.45520: variable 'ansible_search_path' from source: unknown 12033 1726867205.45673: we have included files to process 12033 1726867205.45674: generating all_blocks data 12033 1726867205.45676: done generating all_blocks data 12033 1726867205.45682: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867205.45683: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867205.45686: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12033 1726867205.46029: done processing included file 12033 1726867205.46031: iterating over new_blocks loaded from include file 12033 1726867205.46033: in VariableManager get_vars() 12033 1726867205.46054: done with get_vars() 12033 1726867205.46056: filtering new block on tags 12033 1726867205.46103: done filtering new block on tags 12033 1726867205.46106: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 12033 1726867205.46111: 
extending task lists for all hosts with included blocks 12033 1726867205.46359: done extending task lists 12033 1726867205.46360: done processing included files 12033 1726867205.46361: results queue empty 12033 1726867205.46362: checking for any_errors_fatal 12033 1726867205.46366: done checking for any_errors_fatal 12033 1726867205.46366: checking for max_fail_percentage 12033 1726867205.46368: done checking for max_fail_percentage 12033 1726867205.46368: checking to see if all hosts have failed and the running result is not ok 12033 1726867205.46369: done checking to see if all hosts have failed 12033 1726867205.46370: getting the remaining hosts for this loop 12033 1726867205.46371: done getting the remaining hosts for this loop 12033 1726867205.46374: getting the next task for host managed_node3 12033 1726867205.46381: done getting next task for host managed_node3 12033 1726867205.46383: ^ task is: TASK: Get stat for interface {{ interface }} 12033 1726867205.46387: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867205.46389: getting variables 12033 1726867205.46390: in VariableManager get_vars() 12033 1726867205.46410: Calling all_inventory to load vars for managed_node3 12033 1726867205.46413: Calling groups_inventory to load vars for managed_node3 12033 1726867205.46419: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.46425: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.46428: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.46430: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.47829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867205.50790: done with get_vars() 12033 1726867205.50813: done getting variables 12033 1726867205.50973: variable 'interface' from source: task vars 12033 1726867205.50976: variable 'dhcp_interface2' from source: play vars 12033 1726867205.51166: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:20:05 -0400 (0:00:00.136) 0:00:44.627 ****** 12033 1726867205.51206: entering _queue_task() for managed_node3/stat 12033 1726867205.51951: worker is 1 (out of 1 available) 12033 1726867205.51964: exiting _queue_task() for managed_node3/stat 12033 1726867205.52086: done queuing things up, now waiting for results 
queue to drain 12033 1726867205.52088: waiting for pending results... 12033 1726867205.52595: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 12033 1726867205.52600: in run() - task 0affcac9-a3a5-74bb-502b-000000000a01 12033 1726867205.52983: variable 'ansible_search_path' from source: unknown 12033 1726867205.52987: variable 'ansible_search_path' from source: unknown 12033 1726867205.52990: calling self._execute() 12033 1726867205.52993: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.52996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.52999: variable 'omit' from source: magic vars 12033 1726867205.53672: variable 'ansible_distribution_major_version' from source: facts 12033 1726867205.53800: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867205.53814: variable 'omit' from source: magic vars 12033 1726867205.53992: variable 'omit' from source: magic vars 12033 1726867205.54095: variable 'interface' from source: task vars 12033 1726867205.54190: variable 'dhcp_interface2' from source: play vars 12033 1726867205.54255: variable 'dhcp_interface2' from source: play vars 12033 1726867205.54375: variable 'omit' from source: magic vars 12033 1726867205.54509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867205.54551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867205.54885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867205.54888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867205.54891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 
1726867205.54894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867205.54896: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.54898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.55015: Set connection var ansible_pipelining to False 12033 1726867205.55151: Set connection var ansible_shell_executable to /bin/sh 12033 1726867205.55154: Set connection var ansible_timeout to 10 12033 1726867205.55156: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867205.55158: Set connection var ansible_connection to ssh 12033 1726867205.55161: Set connection var ansible_shell_type to sh 12033 1726867205.55163: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.55165: variable 'ansible_connection' from source: unknown 12033 1726867205.55167: variable 'ansible_module_compression' from source: unknown 12033 1726867205.55169: variable 'ansible_shell_type' from source: unknown 12033 1726867205.55171: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.55172: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.55274: variable 'ansible_pipelining' from source: unknown 12033 1726867205.55279: variable 'ansible_timeout' from source: unknown 12033 1726867205.55282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.55984: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867205.55989: variable 'omit' from source: magic vars 12033 1726867205.55991: starting attempt loop 12033 1726867205.55993: running the handler 12033 1726867205.55995: _low_level_execute_command(): starting 12033 1726867205.55998: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867205.57348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.57376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.57384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.57443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.57476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.57524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.59244: stdout chunk (state=3): >>>/root <<< 12033 1726867205.59529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.59664: stderr chunk (state=3): >>><<< 12033 1726867205.59667: stdout chunk (state=3): >>><<< 12033 1726867205.59671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.59673: _low_level_execute_command(): starting 12033 1726867205.59676: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664 `" && echo ansible-tmp-1726867205.5957575-14245-78328528186664="` echo /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664 `" ) && sleep 0' 12033 1726867205.60541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867205.60555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.60631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.60687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.60712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.60734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.60810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.63290: stdout chunk (state=3): >>>ansible-tmp-1726867205.5957575-14245-78328528186664=/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664 <<< 12033 1726867205.63296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.63299: stdout chunk (state=3): >>><<< 12033 1726867205.63301: stderr chunk (state=3): >>><<< 12033 1726867205.63303: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867205.5957575-14245-78328528186664=/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.63305: variable 'ansible_module_compression' from source: unknown 12033 1726867205.63307: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12033 1726867205.63308: variable 'ansible_facts' from source: unknown 12033 1726867205.63499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py 12033 1726867205.63760: Sending initial data 12033 1726867205.63770: Sent initial data (152 bytes) 12033 1726867205.64365: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867205.64383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.64403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.64422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867205.64440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867205.64540: stderr chunk (state=3): >>>debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.64559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.64625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.66225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867205.66264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867205.66317: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp64glv2dl /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py <<< 12033 1726867205.66321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py" <<< 12033 1726867205.66385: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp64glv2dl" to remote "/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py" <<< 12033 1726867205.67740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.67756: stdout chunk (state=3): >>><<< 12033 1726867205.67769: stderr chunk (state=3): >>><<< 12033 1726867205.67954: done transferring module to remote 12033 1726867205.67957: _low_level_execute_command(): starting 12033 1726867205.67959: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/ /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py && sleep 0' 12033 1726867205.68674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867205.68692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.68706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.68724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867205.68745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867205.68838: 
stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.68858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867205.68873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.68954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.70887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.70891: stdout chunk (state=3): >>><<< 12033 1726867205.70894: stderr chunk (state=3): >>><<< 12033 1726867205.70896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.70898: _low_level_execute_command(): starting 12033 1726867205.70901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/AnsiballZ_stat.py && sleep 0' 12033 1726867205.71449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867205.71464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867205.71476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.71490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.71534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: 
fd 3 setting O_NONBLOCK <<< 12033 1726867205.71551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.71603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.86827: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29174, "dev": 23, "nlink": 1, "atime": 1726867203.5199232, "mtime": 1726867203.5199232, "ctime": 1726867203.5199232, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12033 1726867205.88160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867205.88189: stderr chunk (state=3): >>><<< 12033 1726867205.88193: stdout chunk (state=3): >>><<< 12033 1726867205.88235: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29174, "dev": 23, "nlink": 1, "atime": 1726867203.5199232, "mtime": 1726867203.5199232, "ctime": 1726867203.5199232, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867205.88265: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867205.88278: _low_level_execute_command(): starting 12033 1726867205.88283: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867205.5957575-14245-78328528186664/ > /dev/null 2>&1 && sleep 0' 12033 1726867205.88739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867205.88742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.88751: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867205.88754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867205.88756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867205.88795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867205.88799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867205.88862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867205.90688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867205.90713: stderr chunk (state=3): >>><<< 12033 1726867205.90716: stdout chunk (state=3): >>><<< 12033 1726867205.90730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867205.90735: handler run complete 12033 1726867205.90767: attempt loop complete, returning result 12033 1726867205.90770: _execute() done 12033 1726867205.90773: dumping result to json 12033 1726867205.90775: done dumping result, returning 12033 1726867205.90784: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0affcac9-a3a5-74bb-502b-000000000a01] 12033 1726867205.90788: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a01 12033 1726867205.90899: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a01 12033 1726867205.90901: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726867203.5199232, "block_size": 4096, "blocks": 0, "ctime": 1726867203.5199232, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29174, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726867203.5199232, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12033 1726867205.91011: no more pending results, returning what we have 12033 1726867205.91015: results 
queue empty 12033 1726867205.91016: checking for any_errors_fatal 12033 1726867205.91017: done checking for any_errors_fatal 12033 1726867205.91018: checking for max_fail_percentage 12033 1726867205.91019: done checking for max_fail_percentage 12033 1726867205.91020: checking to see if all hosts have failed and the running result is not ok 12033 1726867205.91021: done checking to see if all hosts have failed 12033 1726867205.91022: getting the remaining hosts for this loop 12033 1726867205.91024: done getting the remaining hosts for this loop 12033 1726867205.91029: getting the next task for host managed_node3 12033 1726867205.91038: done getting next task for host managed_node3 12033 1726867205.91040: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12033 1726867205.91044: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867205.91048: getting variables 12033 1726867205.91049: in VariableManager get_vars() 12033 1726867205.91088: Calling all_inventory to load vars for managed_node3 12033 1726867205.91091: Calling groups_inventory to load vars for managed_node3 12033 1726867205.91095: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.91104: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.91106: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.91109: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.91891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867205.93487: done with get_vars() 12033 1726867205.93509: done getting variables 12033 1726867205.93567: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867205.93687: variable 'interface' from source: task vars 12033 1726867205.93691: variable 'dhcp_interface2' from source: play vars 12033 1726867205.93753: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:20:05 -0400 (0:00:00.425) 0:00:45.053 ****** 12033 1726867205.93791: entering _queue_task() for managed_node3/assert 12033 1726867205.94120: worker is 1 (out of 1 available) 12033 1726867205.94131: exiting _queue_task() for managed_node3/assert 12033 1726867205.94142: done queuing things up, now waiting for results queue to drain 12033 1726867205.94143: waiting for pending results... 
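
The `stat` task above reports that `/sys/class/net/test2` exists as a symlink into `/sys/devices/virtual/net/`, which is how this test suite decides an interface is present. The same probe can be approximated locally with a short sketch; `probe_interface` is a hypothetical helper name, not part of the test suite, and it reproduces only a few of the fields the Ansible `stat` module returns:

```python
import os
import stat as statmod

def probe_interface(name, sysfs="/sys/class/net"):
    # Approximate a subset of the Ansible stat-module payload seen in the log:
    # exists / islnk / mode / lnk_target for a sysfs network-interface entry.
    path = os.path.join(sysfs, name)
    try:
        st = os.lstat(path)  # lstat, not stat: /sys/class/net entries are symlinks
    except FileNotFoundError:
        return {"exists": False, "path": path}
    info = {
        "exists": True,
        "path": path,
        "islnk": statmod.S_ISLNK(st.st_mode),
        "mode": oct(statmod.S_IMODE(st.st_mode)),
    }
    if info["islnk"]:
        info["lnk_target"] = os.readlink(path)
    return info
```

On a live host, `probe_interface("test2")` would mirror the `exists`/`islnk`/`lnk_target` fields in the result above; the later `assert` task only consumes `interface_stat.stat.exists`.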
12033 1726867205.94507: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 12033 1726867205.94598: in run() - task 0affcac9-a3a5-74bb-502b-000000000984 12033 1726867205.94621: variable 'ansible_search_path' from source: unknown 12033 1726867205.94782: variable 'ansible_search_path' from source: unknown 12033 1726867205.94786: calling self._execute() 12033 1726867205.94789: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.94791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.94796: variable 'omit' from source: magic vars 12033 1726867205.95160: variable 'ansible_distribution_major_version' from source: facts 12033 1726867205.95179: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867205.95190: variable 'omit' from source: magic vars 12033 1726867205.95253: variable 'omit' from source: magic vars 12033 1726867205.95358: variable 'interface' from source: task vars 12033 1726867205.95368: variable 'dhcp_interface2' from source: play vars 12033 1726867205.95436: variable 'dhcp_interface2' from source: play vars 12033 1726867205.95467: variable 'omit' from source: magic vars 12033 1726867205.95514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867205.95553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867205.95670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867205.95673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867205.95676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867205.95680: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 12033 1726867205.95682: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.95684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.95766: Set connection var ansible_pipelining to False 12033 1726867205.95786: Set connection var ansible_shell_executable to /bin/sh 12033 1726867205.95801: Set connection var ansible_timeout to 10 12033 1726867205.95810: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867205.95815: Set connection var ansible_connection to ssh 12033 1726867205.95824: Set connection var ansible_shell_type to sh 12033 1726867205.95850: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.95859: variable 'ansible_connection' from source: unknown 12033 1726867205.95867: variable 'ansible_module_compression' from source: unknown 12033 1726867205.95883: variable 'ansible_shell_type' from source: unknown 12033 1726867205.95885: variable 'ansible_shell_executable' from source: unknown 12033 1726867205.95982: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867205.95985: variable 'ansible_pipelining' from source: unknown 12033 1726867205.95989: variable 'ansible_timeout' from source: unknown 12033 1726867205.95992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867205.96059: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867205.96076: variable 'omit' from source: magic vars 12033 1726867205.96089: starting attempt loop 12033 1726867205.96100: running the handler 12033 1726867205.96226: variable 'interface_stat' from source: set_fact 12033 1726867205.96326: Evaluated conditional 
(interface_stat.stat.exists): True 12033 1726867205.96330: handler run complete 12033 1726867205.96333: attempt loop complete, returning result 12033 1726867205.96335: _execute() done 12033 1726867205.96337: dumping result to json 12033 1726867205.96340: done dumping result, returning 12033 1726867205.96342: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0affcac9-a3a5-74bb-502b-000000000984] 12033 1726867205.96344: sending task result for task 0affcac9-a3a5-74bb-502b-000000000984 12033 1726867205.96419: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000984 12033 1726867205.96422: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 12033 1726867205.96482: no more pending results, returning what we have 12033 1726867205.96486: results queue empty 12033 1726867205.96487: checking for any_errors_fatal 12033 1726867205.96500: done checking for any_errors_fatal 12033 1726867205.96501: checking for max_fail_percentage 12033 1726867205.96504: done checking for max_fail_percentage 12033 1726867205.96505: checking to see if all hosts have failed and the running result is not ok 12033 1726867205.96506: done checking to see if all hosts have failed 12033 1726867205.96507: getting the remaining hosts for this loop 12033 1726867205.96509: done getting the remaining hosts for this loop 12033 1726867205.96513: getting the next task for host managed_node3 12033 1726867205.96544: done getting next task for host managed_node3 12033 1726867205.96547: ^ task is: TASK: Test 12033 1726867205.96551: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867205.96556: getting variables 12033 1726867205.96558: in VariableManager get_vars() 12033 1726867205.96607: Calling all_inventory to load vars for managed_node3 12033 1726867205.96610: Calling groups_inventory to load vars for managed_node3 12033 1726867205.96613: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867205.96625: Calling all_plugins_play to load vars for managed_node3 12033 1726867205.96628: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867205.96632: Calling groups_plugins_play to load vars for managed_node3 12033 1726867205.98581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.00195: done with get_vars() 12033 1726867206.00220: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 17:20:06 -0400 (0:00:00.065) 0:00:45.118 ****** 12033 1726867206.00308: entering _queue_task() for managed_node3/include_tasks 12033 1726867206.00602: worker is 1 (out of 1 available) 12033 1726867206.00614: exiting _queue_task() for managed_node3/include_tasks 12033 1726867206.00626: done queuing things up, now waiting for results queue to drain 12033 1726867206.00627: waiting for pending results... 
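
Each `_low_level_execute_command()` in this log that creates a remote temp directory uses the same shell pattern: `( umask 77 && mkdir -p BASE && mkdir TASK_DIR )`, giving an owner-only (0700) per-task directory. A minimal Python sketch of that pattern, assuming nothing beyond the standard library (the helper name and paths are invented for illustration):

```python
import os

def make_task_tmpdir(base, label):
    # Mimic the log's `( umask 77 && mkdir -p base && mkdir task_dir )`:
    # tighten the umask, ensure the parent exists, then create a unique
    # per-task directory that must not already exist.
    old_umask = os.umask(0o077)            # same effect as `umask 77`
    try:
        os.makedirs(base, exist_ok=True)   # `mkdir -p`
        task_dir = os.path.join(base, f"ansible-tmp-{label}")
        os.mkdir(task_dir)                 # plain `mkdir`: fails if it exists
    finally:
        os.umask(old_umask)                # restore the caller's umask
    return task_dir
```

With umask 077 in effect, both directories come out mode 0700, matching the private `~/.ansible/tmp/ansible-tmp-…` directories created and later removed (`rm -f -r … && sleep 0`) throughout this run.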
12033 1726867206.00998: running TaskExecutor() for managed_node3/TASK: Test 12033 1726867206.01005: in run() - task 0affcac9-a3a5-74bb-502b-0000000008ee 12033 1726867206.01015: variable 'ansible_search_path' from source: unknown 12033 1726867206.01018: variable 'ansible_search_path' from source: unknown 12033 1726867206.01059: variable 'lsr_test' from source: include params 12033 1726867206.01258: variable 'lsr_test' from source: include params 12033 1726867206.01328: variable 'omit' from source: magic vars 12033 1726867206.01582: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.01586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.01589: variable 'omit' from source: magic vars 12033 1726867206.01730: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.01739: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.01745: variable 'item' from source: unknown 12033 1726867206.01813: variable 'item' from source: unknown 12033 1726867206.01842: variable 'item' from source: unknown 12033 1726867206.01911: variable 'item' from source: unknown 12033 1726867206.02033: dumping result to json 12033 1726867206.02036: done dumping result, returning 12033 1726867206.02040: done running TaskExecutor() for managed_node3/TASK: Test [0affcac9-a3a5-74bb-502b-0000000008ee] 12033 1726867206.02043: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ee 12033 1726867206.02308: no more pending results, returning what we have 12033 1726867206.02312: in VariableManager get_vars() 12033 1726867206.02350: Calling all_inventory to load vars for managed_node3 12033 1726867206.02353: Calling groups_inventory to load vars for managed_node3 12033 1726867206.02355: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.02364: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.02367: Calling 
groups_plugins_inventory to load vars for managed_node3 12033 1726867206.02370: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.02993: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ee 12033 1726867206.02997: WORKER PROCESS EXITING 12033 1726867206.03886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.05550: done with get_vars() 12033 1726867206.05570: variable 'ansible_search_path' from source: unknown 12033 1726867206.05571: variable 'ansible_search_path' from source: unknown 12033 1726867206.05613: we have included files to process 12033 1726867206.05615: generating all_blocks data 12033 1726867206.05616: done generating all_blocks data 12033 1726867206.05621: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12033 1726867206.05623: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12033 1726867206.05625: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 12033 1726867206.05882: in VariableManager get_vars() 12033 1726867206.05903: done with get_vars() 12033 1726867206.05909: variable 'omit' from source: magic vars 12033 1726867206.05946: variable 'omit' from source: magic vars 12033 1726867206.06002: in VariableManager get_vars() 12033 1726867206.06016: done with get_vars() 12033 1726867206.06041: in VariableManager get_vars() 12033 1726867206.06057: done with get_vars() 12033 1726867206.06096: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12033 1726867206.06246: Loading data from 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12033 1726867206.06331: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12033 1726867206.06711: in VariableManager get_vars() 12033 1726867206.06736: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867206.08834: done processing included file 12033 1726867206.08836: iterating over new_blocks loaded from include file 12033 1726867206.08838: in VariableManager get_vars() 12033 1726867206.08870: done with get_vars() 12033 1726867206.08872: filtering new block on tags 12033 1726867206.09185: done filtering new block on tags 12033 1726867206.09189: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed_node3 => (item=tasks/create_bond_profile_reconfigure.yml) 12033 1726867206.09195: extending task lists for all hosts with included blocks 12033 1726867206.10504: done extending task lists 12033 1726867206.10506: done processing included files 12033 1726867206.10506: results queue empty 12033 1726867206.10507: checking for any_errors_fatal 12033 1726867206.10510: done checking for any_errors_fatal 12033 1726867206.10511: checking for max_fail_percentage 12033 1726867206.10512: done checking for max_fail_percentage 12033 1726867206.10513: checking to see if all hosts have failed and the running result is not ok 12033 1726867206.10514: done checking to see if all hosts have failed 12033 1726867206.10514: getting the remaining hosts for this loop 12033 1726867206.10516: done getting the remaining hosts for this loop 12033 1726867206.10522: getting the next task for host managed_node3 12033 1726867206.10528: done getting next task for host managed_node3 12033 1726867206.10531: ^ task is: TASK: 
fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867206.10534: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867206.10544: getting variables 12033 1726867206.10545: in VariableManager get_vars() 12033 1726867206.10559: Calling all_inventory to load vars for managed_node3 12033 1726867206.10561: Calling groups_inventory to load vars for managed_node3 12033 1726867206.10563: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.10568: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.10570: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.10573: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.11840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.13382: done with get_vars() 12033 1726867206.13403: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:06 -0400 (0:00:00.131) 0:00:45.250 ****** 12033 1726867206.13478: entering _queue_task() for managed_node3/include_tasks 12033 1726867206.13906: worker is 1 (out of 1 available) 12033 1726867206.13919: exiting _queue_task() for managed_node3/include_tasks 12033 1726867206.14045: done queuing things up, now waiting for results queue to drain 12033 1726867206.14046: waiting for pending results... 
12033 1726867206.14173: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867206.14331: in run() - task 0affcac9-a3a5-74bb-502b-000000000a2e 12033 1726867206.14353: variable 'ansible_search_path' from source: unknown 12033 1726867206.14367: variable 'ansible_search_path' from source: unknown 12033 1726867206.14412: calling self._execute() 12033 1726867206.14524: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.14535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.14582: variable 'omit' from source: magic vars 12033 1726867206.14934: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.14956: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.14966: _execute() done 12033 1726867206.14973: dumping result to json 12033 1726867206.14982: done dumping result, returning 12033 1726867206.15016: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-74bb-502b-000000000a2e] 12033 1726867206.15019: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a2e 12033 1726867206.15232: no more pending results, returning what we have 12033 1726867206.15237: in VariableManager get_vars() 12033 1726867206.15289: Calling all_inventory to load vars for managed_node3 12033 1726867206.15292: Calling groups_inventory to load vars for managed_node3 12033 1726867206.15294: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.15307: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.15310: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.15313: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.15940: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a2e 12033 
1726867206.15945: WORKER PROCESS EXITING 12033 1726867206.16881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.18483: done with get_vars() 12033 1726867206.18500: variable 'ansible_search_path' from source: unknown 12033 1726867206.18501: variable 'ansible_search_path' from source: unknown 12033 1726867206.18543: we have included files to process 12033 1726867206.18544: generating all_blocks data 12033 1726867206.18546: done generating all_blocks data 12033 1726867206.18548: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867206.18549: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867206.18551: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867206.19129: done processing included file 12033 1726867206.19132: iterating over new_blocks loaded from include file 12033 1726867206.19133: in VariableManager get_vars() 12033 1726867206.19161: done with get_vars() 12033 1726867206.19163: filtering new block on tags 12033 1726867206.19198: done filtering new block on tags 12033 1726867206.19202: in VariableManager get_vars() 12033 1726867206.19227: done with get_vars() 12033 1726867206.19229: filtering new block on tags 12033 1726867206.19274: done filtering new block on tags 12033 1726867206.19279: in VariableManager get_vars() 12033 1726867206.19309: done with get_vars() 12033 1726867206.19311: filtering new block on tags 12033 1726867206.19354: done filtering new block on tags 12033 1726867206.19356: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12033 1726867206.19361: extending task lists for all hosts 
with included blocks 12033 1726867206.21085: done extending task lists 12033 1726867206.21087: done processing included files 12033 1726867206.21088: results queue empty 12033 1726867206.21088: checking for any_errors_fatal 12033 1726867206.21092: done checking for any_errors_fatal 12033 1726867206.21093: checking for max_fail_percentage 12033 1726867206.21094: done checking for max_fail_percentage 12033 1726867206.21095: checking to see if all hosts have failed and the running result is not ok 12033 1726867206.21096: done checking to see if all hosts have failed 12033 1726867206.21097: getting the remaining hosts for this loop 12033 1726867206.21098: done getting the remaining hosts for this loop 12033 1726867206.21100: getting the next task for host managed_node3 12033 1726867206.21105: done getting next task for host managed_node3 12033 1726867206.21108: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867206.21112: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867206.21122: getting variables 12033 1726867206.21123: in VariableManager get_vars() 12033 1726867206.21144: Calling all_inventory to load vars for managed_node3 12033 1726867206.21146: Calling groups_inventory to load vars for managed_node3 12033 1726867206.21148: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.21154: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.21156: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.21159: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.22564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.25256: done with get_vars() 12033 1726867206.25341: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:06 -0400 (0:00:00.119) 0:00:45.370 ****** 12033 1726867206.25435: entering _queue_task() for managed_node3/setup 12033 1726867206.26295: worker is 1 (out of 1 available) 12033 1726867206.26313: exiting _queue_task() for managed_node3/setup 12033 1726867206.26325: done queuing things up, now waiting for results queue to drain 12033 1726867206.26326: waiting for pending results... 
12033 1726867206.26652: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867206.27006: in run() - task 0affcac9-a3a5-74bb-502b-000000000b10 12033 1726867206.27029: variable 'ansible_search_path' from source: unknown 12033 1726867206.27033: variable 'ansible_search_path' from source: unknown 12033 1726867206.27069: calling self._execute() 12033 1726867206.27139: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.27143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.27178: variable 'omit' from source: magic vars 12033 1726867206.27900: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.27935: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.28332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867206.31286: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867206.31546: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867206.31831: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867206.31835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867206.31837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867206.31910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867206.32017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867206.32187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867206.32239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867206.32263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867206.32384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867206.32474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867206.32510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867206.32608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867206.32681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867206.33000: variable '__network_required_facts' from source: role 
'' defaults 12033 1726867206.33072: variable 'ansible_facts' from source: unknown 12033 1726867206.34138: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12033 1726867206.34150: when evaluation is False, skipping this task 12033 1726867206.34158: _execute() done 12033 1726867206.34191: dumping result to json 12033 1726867206.34194: done dumping result, returning 12033 1726867206.34199: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-74bb-502b-000000000b10] 12033 1726867206.34201: sending task result for task 0affcac9-a3a5-74bb-502b-000000000b10 12033 1726867206.34583: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000b10 12033 1726867206.34587: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867206.34627: no more pending results, returning what we have 12033 1726867206.34630: results queue empty 12033 1726867206.34631: checking for any_errors_fatal 12033 1726867206.34633: done checking for any_errors_fatal 12033 1726867206.34634: checking for max_fail_percentage 12033 1726867206.34635: done checking for max_fail_percentage 12033 1726867206.34636: checking to see if all hosts have failed and the running result is not ok 12033 1726867206.34637: done checking to see if all hosts have failed 12033 1726867206.34637: getting the remaining hosts for this loop 12033 1726867206.34639: done getting the remaining hosts for this loop 12033 1726867206.34642: getting the next task for host managed_node3 12033 1726867206.34652: done getting next task for host managed_node3 12033 1726867206.34655: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867206.34660: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867206.34681: getting variables 12033 1726867206.34682: in VariableManager get_vars() 12033 1726867206.34727: Calling all_inventory to load vars for managed_node3 12033 1726867206.34730: Calling groups_inventory to load vars for managed_node3 12033 1726867206.34732: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.34741: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.34744: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.34752: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.36153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.37973: done with get_vars() 12033 1726867206.37998: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:06 -0400 (0:00:00.126) 0:00:45.496 ****** 12033 1726867206.38101: entering _queue_task() for managed_node3/stat 12033 1726867206.38617: worker is 1 (out of 1 available) 12033 1726867206.38634: exiting _queue_task() for managed_node3/stat 12033 1726867206.38647: done queuing things up, now waiting for results queue to drain 12033 1726867206.38649: waiting for pending results... 
12033 1726867206.39156: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867206.39622: in run() - task 0affcac9-a3a5-74bb-502b-000000000b12 12033 1726867206.39637: variable 'ansible_search_path' from source: unknown 12033 1726867206.39641: variable 'ansible_search_path' from source: unknown 12033 1726867206.39676: calling self._execute() 12033 1726867206.39935: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.40058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.40066: variable 'omit' from source: magic vars 12033 1726867206.41067: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.41071: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.41176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867206.41439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867206.41489: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867206.41533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867206.41568: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867206.41660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867206.41691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867206.41729: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867206.41759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867206.41858: variable '__network_is_ostree' from source: set_fact 12033 1726867206.41870: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867206.41880: when evaluation is False, skipping this task 12033 1726867206.41888: _execute() done 12033 1726867206.41894: dumping result to json 12033 1726867206.41936: done dumping result, returning 12033 1726867206.41939: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-74bb-502b-000000000b12] 12033 1726867206.41941: sending task result for task 0affcac9-a3a5-74bb-502b-000000000b12 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867206.42092: no more pending results, returning what we have 12033 1726867206.42098: results queue empty 12033 1726867206.42099: checking for any_errors_fatal 12033 1726867206.42107: done checking for any_errors_fatal 12033 1726867206.42108: checking for max_fail_percentage 12033 1726867206.42109: done checking for max_fail_percentage 12033 1726867206.42110: checking to see if all hosts have failed and the running result is not ok 12033 1726867206.42111: done checking to see if all hosts have failed 12033 1726867206.42112: getting the remaining hosts for this loop 12033 1726867206.42114: done getting the remaining hosts for this loop 12033 1726867206.42118: getting the next task for host managed_node3 12033 1726867206.42125: done getting next task for host managed_node3 12033 
1726867206.42128: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867206.42133: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867206.42153: getting variables 12033 1726867206.42154: in VariableManager get_vars() 12033 1726867206.42268: Calling all_inventory to load vars for managed_node3 12033 1726867206.42272: Calling groups_inventory to load vars for managed_node3 12033 1726867206.42274: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.42584: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000b12 12033 1726867206.42588: WORKER PROCESS EXITING 12033 1726867206.42599: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.42603: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.42606: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.45305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.48500: done with get_vars() 12033 1726867206.48526: done getting variables 12033 1726867206.48690: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:06 -0400 (0:00:00.106) 0:00:45.603 ****** 12033 1726867206.48734: entering _queue_task() for managed_node3/set_fact 12033 1726867206.49361: worker is 1 (out of 1 available) 12033 1726867206.49376: exiting _queue_task() for managed_node3/set_fact 12033 1726867206.49395: done queuing things up, now waiting for results queue to drain 12033 1726867206.49399: waiting for pending results... 
12033 1726867206.49978: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867206.50331: in run() - task 0affcac9-a3a5-74bb-502b-000000000b13 12033 1726867206.50344: variable 'ansible_search_path' from source: unknown 12033 1726867206.50348: variable 'ansible_search_path' from source: unknown 12033 1726867206.50382: calling self._execute() 12033 1726867206.50473: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.50483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.50698: variable 'omit' from source: magic vars 12033 1726867206.51238: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.51251: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.51642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867206.52372: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867206.52419: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867206.52456: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867206.52490: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867206.53526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867206.53529: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867206.53554: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867206.53581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867206.53665: variable '__network_is_ostree' from source: set_fact 12033 1726867206.53784: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867206.53787: when evaluation is False, skipping this task 12033 1726867206.53790: _execute() done 12033 1726867206.53792: dumping result to json 12033 1726867206.53888: done dumping result, returning 12033 1726867206.53900: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-74bb-502b-000000000b13] 12033 1726867206.53910: sending task result for task 0affcac9-a3a5-74bb-502b-000000000b13 12033 1726867206.54086: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000b13 12033 1726867206.54090: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867206.54141: no more pending results, returning what we have 12033 1726867206.54146: results queue empty 12033 1726867206.54147: checking for any_errors_fatal 12033 1726867206.54156: done checking for any_errors_fatal 12033 1726867206.54156: checking for max_fail_percentage 12033 1726867206.54158: done checking for max_fail_percentage 12033 1726867206.54159: checking to see if all hosts have failed and the running result is not ok 12033 1726867206.54160: done checking to see if all hosts have failed 12033 1726867206.54161: getting the remaining hosts for this loop 12033 1726867206.54163: done getting the remaining hosts for this loop 
12033 1726867206.54166: getting the next task for host managed_node3 12033 1726867206.54178: done getting next task for host managed_node3 12033 1726867206.54182: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867206.54187: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867206.54208: getting variables 12033 1726867206.54210: in VariableManager get_vars() 12033 1726867206.54248: Calling all_inventory to load vars for managed_node3 12033 1726867206.54251: Calling groups_inventory to load vars for managed_node3 12033 1726867206.54253: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867206.54261: Calling all_plugins_play to load vars for managed_node3 12033 1726867206.54263: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867206.54266: Calling groups_plugins_play to load vars for managed_node3 12033 1726867206.56053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867206.58131: done with get_vars() 12033 1726867206.58160: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:06 -0400 (0:00:00.095) 0:00:45.698 ****** 12033 1726867206.58292: entering _queue_task() for managed_node3/service_facts 12033 1726867206.58710: worker is 1 (out of 1 available) 12033 1726867206.58725: exiting _queue_task() for managed_node3/service_facts 12033 1726867206.58852: done queuing things up, now waiting for results queue to drain 12033 1726867206.58854: waiting for pending results... 
12033 1726867206.59021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867206.59284: in run() - task 0affcac9-a3a5-74bb-502b-000000000b15 12033 1726867206.59298: variable 'ansible_search_path' from source: unknown 12033 1726867206.59302: variable 'ansible_search_path' from source: unknown 12033 1726867206.59307: calling self._execute() 12033 1726867206.59406: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.59417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.59429: variable 'omit' from source: magic vars 12033 1726867206.59799: variable 'ansible_distribution_major_version' from source: facts 12033 1726867206.59817: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867206.59839: variable 'omit' from source: magic vars 12033 1726867206.59931: variable 'omit' from source: magic vars 12033 1726867206.60058: variable 'omit' from source: magic vars 12033 1726867206.60061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867206.60070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867206.60114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867206.60136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867206.60152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867206.60206: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867206.60214: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.60221: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867206.60331: Set connection var ansible_pipelining to False 12033 1726867206.60345: Set connection var ansible_shell_executable to /bin/sh 12033 1726867206.60356: Set connection var ansible_timeout to 10 12033 1726867206.60365: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867206.60371: Set connection var ansible_connection to ssh 12033 1726867206.60389: Set connection var ansible_shell_type to sh 12033 1726867206.60417: variable 'ansible_shell_executable' from source: unknown 12033 1726867206.60424: variable 'ansible_connection' from source: unknown 12033 1726867206.60482: variable 'ansible_module_compression' from source: unknown 12033 1726867206.60492: variable 'ansible_shell_type' from source: unknown 12033 1726867206.60495: variable 'ansible_shell_executable' from source: unknown 12033 1726867206.60499: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867206.60501: variable 'ansible_pipelining' from source: unknown 12033 1726867206.60503: variable 'ansible_timeout' from source: unknown 12033 1726867206.60505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867206.60676: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867206.60694: variable 'omit' from source: magic vars 12033 1726867206.60713: starting attempt loop 12033 1726867206.60720: running the handler 12033 1726867206.60736: _low_level_execute_command(): starting 12033 1726867206.60747: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867206.61609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867206.61624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867206.61641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867206.61833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867206.63515: stdout chunk (state=3): >>>/root <<< 12033 1726867206.63661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867206.63676: stdout chunk (state=3): >>><<< 12033 1726867206.63711: stderr chunk (state=3): >>><<< 12033 1726867206.63736: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867206.63843: _low_level_execute_command(): starting 12033 1726867206.63847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279 `" && echo ansible-tmp-1726867206.6374512-14292-119062157781279="` echo /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279 `" ) && sleep 0' 12033 1726867206.64878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867206.64954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867206.65031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867206.66929: stdout chunk (state=3): >>>ansible-tmp-1726867206.6374512-14292-119062157781279=/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279 <<< 12033 1726867206.67037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867206.67095: stderr chunk (state=3): >>><<< 12033 1726867206.67106: stdout chunk (state=3): >>><<< 12033 1726867206.67134: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867206.6374512-14292-119062157781279=/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867206.67191: variable 'ansible_module_compression' from source: unknown 12033 1726867206.67243: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12033 1726867206.67287: variable 'ansible_facts' from source: unknown 12033 1726867206.67380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py 12033 1726867206.67633: Sending initial data 12033 1726867206.67637: Sent initial data (162 bytes) 12033 1726867206.68332: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867206.68365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867206.68382: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867206.68496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867206.68574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 
12033 1726867206.68663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867206.68689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867206.68768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867206.70316: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867206.70372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867206.70433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjonr5flx /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py <<< 12033 1726867206.70437: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py" <<< 12033 1726867206.70514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpjonr5flx" to remote "/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py" <<< 12033 1726867206.71524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867206.71528: stdout chunk (state=3): >>><<< 12033 1726867206.71530: stderr chunk (state=3): >>><<< 12033 1726867206.71544: done transferring module to remote 12033 1726867206.71565: _low_level_execute_command(): starting 12033 1726867206.71574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/ /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py && sleep 0' 12033 1726867206.72304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867206.72320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867206.72334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867206.72387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867206.72462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867206.72493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867206.72518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867206.72596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867206.74388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867206.74396: stdout chunk (state=3): >>><<< 12033 1726867206.74401: stderr chunk (state=3): >>><<< 12033 1726867206.74494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867206.74501: _low_level_execute_command(): starting 12033 1726867206.74504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/AnsiballZ_service_facts.py && sleep 0' 12033 1726867206.75049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867206.75064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867206.75080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867206.75103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867206.75146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867206.75217: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867206.75247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867206.75388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.27088: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 12033 1726867208.27141: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 12033 1726867208.27173: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12033 1726867208.28671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.28735: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. <<< 12033 1726867208.28762: stderr chunk (state=3): >>><<< 12033 1726867208.28765: stdout chunk (state=3): >>><<< 12033 1726867208.28984: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": 
{"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": 
{"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
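(Editor's note, not part of the captured log: the JSON payload above is the dictionary that the `service_facts` module registers under `ansible_facts.services`, keyed by unit name with `state`, `status`, and `source` fields. A minimal, hypothetical playbook fragment consuming that structure — task names and the `sshd.service` key are illustrative, not taken from this run — might look like:)

```yaml
# Sketch only: assumes the service_facts result shown above.
- name: Gather service states (populates ansible_facts.services)
  ansible.builtin.service_facts:

- name: Report sshd state from the gathered facts
  ansible.builtin.debug:
    msg: "sshd is {{ ansible_facts.services['sshd.service'].state }}"
  when: "'sshd.service' in ansible_facts.services"
```

(End of editor's note; log resumes below.)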
12033 1726867208.29584: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867208.29600: _low_level_execute_command(): starting 12033 1726867208.29610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867206.6374512-14292-119062157781279/ > /dev/null 2>&1 && sleep 0' 12033 1726867208.30266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867208.30283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867208.30298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867208.30324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867208.30340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867208.30434: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.30454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867208.30470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867208.30495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.30564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.32462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.32471: stdout chunk (state=3): >>><<< 12033 1726867208.32485: stderr chunk (state=3): >>><<< 12033 1726867208.32506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867208.32519: handler run complete 12033 1726867208.32751: variable 'ansible_facts' from source: unknown 12033 1726867208.32962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867208.33449: variable 'ansible_facts' from source: unknown 12033 1726867208.33600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867208.33829: attempt loop complete, returning result 12033 1726867208.33882: _execute() done 12033 1726867208.33885: dumping result to json 12033 1726867208.33918: done dumping result, returning 12033 1726867208.33938: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-74bb-502b-000000000b15] 12033 1726867208.33955: sending task result for task 0affcac9-a3a5-74bb-502b-000000000b15 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867208.35064: no more pending results, returning what we have 12033 1726867208.35067: results queue empty 12033 1726867208.35068: checking for any_errors_fatal 12033 1726867208.35073: done checking for any_errors_fatal 12033 1726867208.35074: checking for max_fail_percentage 12033 1726867208.35075: done checking for max_fail_percentage 12033 1726867208.35076: checking to see if all hosts have failed and the running result is not ok 12033 1726867208.35079: done checking to see if all hosts have failed 12033 1726867208.35080: getting the remaining hosts for this loop 12033 1726867208.35081: done getting the remaining hosts for this loop 12033 1726867208.35084: getting the next task for host managed_node3 12033 1726867208.35090: done getting next task for 
host managed_node3 12033 1726867208.35093: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867208.35105: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867208.35117: getting variables 12033 1726867208.35119: in VariableManager get_vars() 12033 1726867208.35152: Calling all_inventory to load vars for managed_node3 12033 1726867208.35155: Calling groups_inventory to load vars for managed_node3 12033 1726867208.35157: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867208.35167: Calling all_plugins_play to load vars for managed_node3 12033 1726867208.35170: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867208.35172: Calling groups_plugins_play to load vars for managed_node3 12033 1726867208.35182: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000b15 12033 1726867208.35832: WORKER PROCESS EXITING 12033 1726867208.36720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867208.38392: done with get_vars() 12033 1726867208.38414: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:08 -0400 (0:00:01.802) 0:00:47.500 ****** 12033 1726867208.38522: entering _queue_task() for managed_node3/package_facts 12033 1726867208.38855: worker is 1 (out of 1 available) 12033 1726867208.38868: exiting _queue_task() for managed_node3/package_facts 12033 1726867208.39013: done queuing things up, now waiting for results queue to drain 12033 1726867208.39015: waiting for pending results... 
12033 1726867208.39189: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867208.39361: in run() - task 0affcac9-a3a5-74bb-502b-000000000b16 12033 1726867208.39383: variable 'ansible_search_path' from source: unknown 12033 1726867208.39390: variable 'ansible_search_path' from source: unknown 12033 1726867208.39428: calling self._execute() 12033 1726867208.39531: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867208.39542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867208.39582: variable 'omit' from source: magic vars 12033 1726867208.39952: variable 'ansible_distribution_major_version' from source: facts 12033 1726867208.39968: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867208.39995: variable 'omit' from source: magic vars 12033 1726867208.40095: variable 'omit' from source: magic vars 12033 1726867208.40214: variable 'omit' from source: magic vars 12033 1726867208.40217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867208.40224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867208.40247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867208.40269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867208.40288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867208.40334: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867208.40343: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867208.40351: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867208.40463: Set connection var ansible_pipelining to False 12033 1726867208.40476: Set connection var ansible_shell_executable to /bin/sh 12033 1726867208.40539: Set connection var ansible_timeout to 10 12033 1726867208.40547: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867208.40550: Set connection var ansible_connection to ssh 12033 1726867208.40552: Set connection var ansible_shell_type to sh 12033 1726867208.40554: variable 'ansible_shell_executable' from source: unknown 12033 1726867208.40556: variable 'ansible_connection' from source: unknown 12033 1726867208.40558: variable 'ansible_module_compression' from source: unknown 12033 1726867208.40563: variable 'ansible_shell_type' from source: unknown 12033 1726867208.40570: variable 'ansible_shell_executable' from source: unknown 12033 1726867208.40576: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867208.40586: variable 'ansible_pipelining' from source: unknown 12033 1726867208.40592: variable 'ansible_timeout' from source: unknown 12033 1726867208.40599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867208.40870: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867208.40879: variable 'omit' from source: magic vars 12033 1726867208.40882: starting attempt loop 12033 1726867208.40884: running the handler 12033 1726867208.40886: _low_level_execute_command(): starting 12033 1726867208.40888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867208.41683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.41723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867208.41750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867208.41776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.41857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.43489: stdout chunk (state=3): >>>/root <<< 12033 1726867208.43626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.43629: stdout chunk (state=3): >>><<< 12033 1726867208.43630: stderr chunk (state=3): >>><<< 12033 1726867208.43642: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867208.43682: _low_level_execute_command(): starting 12033 1726867208.43686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305 `" && echo ansible-tmp-1726867208.436479-14369-160660653215305="` echo /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305 `" ) && sleep 0' 12033 1726867208.44080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867208.44084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867208.44086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867208.44094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.44136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867208.44139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.44188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.46074: stdout chunk (state=3): >>>ansible-tmp-1726867208.436479-14369-160660653215305=/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305 <<< 12033 1726867208.46289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.46313: stdout chunk (state=3): >>><<< 12033 1726867208.46316: stderr chunk (state=3): >>><<< 12033 1726867208.46319: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867208.436479-14369-160660653215305=/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867208.46322: variable 'ansible_module_compression' from source: unknown 12033 1726867208.46346: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12033 1726867208.46394: variable 'ansible_facts' from source: unknown 12033 1726867208.46512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py 12033 1726867208.46605: Sending initial data 12033 1726867208.46608: Sent initial data (161 bytes) 12033 1726867208.47054: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867208.47061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867208.47075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.47133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.48684: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867208.48722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867208.48760: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpeat6k1o1 /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py <<< 12033 1726867208.48779: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py" <<< 12033 1726867208.48821: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpeat6k1o1" to remote "/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py" <<< 12033 1726867208.49882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.49980: stderr chunk (state=3): >>><<< 12033 1726867208.49983: stdout chunk (state=3): >>><<< 12033 1726867208.49985: done transferring module to remote 12033 1726867208.49988: _low_level_execute_command(): starting 12033 1726867208.49990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/ /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py && sleep 0' 12033 1726867208.50338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867208.50341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867208.50344: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.50347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867208.50350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.50394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867208.50397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.50445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.52200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867208.52221: stderr chunk (state=3): >>><<< 12033 1726867208.52224: stdout chunk (state=3): >>><<< 12033 1726867208.52235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867208.52238: _low_level_execute_command(): starting 12033 1726867208.52241: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/AnsiballZ_package_facts.py && sleep 0' 12033 1726867208.52634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867208.52637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867208.52640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867208.52642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867208.52644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867208.52685: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867208.52688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867208.52741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867208.96644: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 12033 1726867208.96718: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": 
[{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": 
"23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": 
"22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1",
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": 
"2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": 
[{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": 
"3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 12033 1726867208.96850: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12033 1726867208.98607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867208.98610: stdout chunk (state=3): >>><<< 12033 1726867208.98613: stderr chunk (state=3): >>><<< 12033 1726867208.98790: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867209.06170: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867209.06187: _low_level_execute_command(): starting 12033 1726867209.06191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867208.436479-14369-160660653215305/ > /dev/null 2>&1 && sleep 0' 12033 1726867209.06621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867209.06625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867209.06650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867209.06653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867209.06713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867209.06717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867209.06719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867209.06780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867209.08655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867209.08679: stderr chunk (state=3): >>><<< 12033 1726867209.08682: stdout chunk (state=3): >>><<< 12033 1726867209.08737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12033 1726867209.08741: handler run complete 12033 1726867209.09153: variable 'ansible_facts' from source: unknown 12033 1726867209.09403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.10414: variable 'ansible_facts' from source: unknown 12033 1726867209.10813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.11361: attempt loop complete, returning result 12033 1726867209.11372: _execute() done 12033 1726867209.11375: dumping result to json 12033 1726867209.11571: done dumping result, returning 12033 1726867209.11580: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-74bb-502b-000000000b16] 12033 1726867209.11583: sending task result for task 0affcac9-a3a5-74bb-502b-000000000b16 12033 1726867209.18896: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000b16 12033 1726867209.18902: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867209.19009: no more pending results, returning what we have 12033 1726867209.19011: results queue empty 12033 1726867209.19012: checking for any_errors_fatal 12033 1726867209.19015: done checking for any_errors_fatal 12033 1726867209.19016: checking for max_fail_percentage 12033 1726867209.19017: done checking for max_fail_percentage 12033 1726867209.19018: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.19019: done checking to see if all hosts have failed 12033 1726867209.19019: getting the remaining hosts for this loop 12033 1726867209.19020: done getting the remaining hosts for this loop 12033 1726867209.19023: getting the next task for host managed_node3 12033 1726867209.19028: done 
getting next task for host managed_node3 12033 1726867209.19031: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867209.19036: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867209.19045: getting variables 12033 1726867209.19046: in VariableManager get_vars() 12033 1726867209.19073: Calling all_inventory to load vars for managed_node3 12033 1726867209.19075: Calling groups_inventory to load vars for managed_node3 12033 1726867209.19079: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.19086: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.19088: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.19091: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.20291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.22058: done with get_vars() 12033 1726867209.22081: done getting variables 12033 1726867209.22140: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:09 -0400 (0:00:00.836) 0:00:48.337 ****** 12033 1726867209.22170: entering _queue_task() for managed_node3/debug 12033 1726867209.22617: worker is 1 (out of 1 available) 12033 1726867209.22630: exiting _queue_task() for managed_node3/debug 12033 1726867209.22643: done queuing things up, now waiting for results queue to drain 12033 1726867209.22644: waiting for pending results... 
12033 1726867209.23003: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867209.23123: in run() - task 0affcac9-a3a5-74bb-502b-000000000a2f 12033 1726867209.23143: variable 'ansible_search_path' from source: unknown 12033 1726867209.23149: variable 'ansible_search_path' from source: unknown 12033 1726867209.23190: calling self._execute() 12033 1726867209.23296: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.23320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.23333: variable 'omit' from source: magic vars 12033 1726867209.23692: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.23709: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.23717: variable 'omit' from source: magic vars 12033 1726867209.23772: variable 'omit' from source: magic vars 12033 1726867209.23853: variable 'network_provider' from source: set_fact 12033 1726867209.23922: variable 'omit' from source: magic vars 12033 1726867209.23926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867209.23930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867209.23945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867209.23963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867209.23974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867209.24000: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867209.24005: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867209.24009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.24082: Set connection var ansible_pipelining to False 12033 1726867209.24090: Set connection var ansible_shell_executable to /bin/sh 12033 1726867209.24096: Set connection var ansible_timeout to 10 12033 1726867209.24102: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867209.24107: Set connection var ansible_connection to ssh 12033 1726867209.24112: Set connection var ansible_shell_type to sh 12033 1726867209.24128: variable 'ansible_shell_executable' from source: unknown 12033 1726867209.24132: variable 'ansible_connection' from source: unknown 12033 1726867209.24135: variable 'ansible_module_compression' from source: unknown 12033 1726867209.24138: variable 'ansible_shell_type' from source: unknown 12033 1726867209.24140: variable 'ansible_shell_executable' from source: unknown 12033 1726867209.24142: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.24145: variable 'ansible_pipelining' from source: unknown 12033 1726867209.24147: variable 'ansible_timeout' from source: unknown 12033 1726867209.24150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.24255: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867209.24262: variable 'omit' from source: magic vars 12033 1726867209.24272: starting attempt loop 12033 1726867209.24276: running the handler 12033 1726867209.24312: handler run complete 12033 1726867209.24323: attempt loop complete, returning result 12033 1726867209.24326: _execute() done 12033 1726867209.24328: dumping result to json 12033 1726867209.24330: done dumping result, returning 
12033 1726867209.24337: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-74bb-502b-000000000a2f] 12033 1726867209.24342: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a2f ok: [managed_node3] => {} MSG: Using network provider: nm 12033 1726867209.24484: no more pending results, returning what we have 12033 1726867209.24489: results queue empty 12033 1726867209.24490: checking for any_errors_fatal 12033 1726867209.24503: done checking for any_errors_fatal 12033 1726867209.24504: checking for max_fail_percentage 12033 1726867209.24506: done checking for max_fail_percentage 12033 1726867209.24506: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.24507: done checking to see if all hosts have failed 12033 1726867209.24508: getting the remaining hosts for this loop 12033 1726867209.24510: done getting the remaining hosts for this loop 12033 1726867209.24513: getting the next task for host managed_node3 12033 1726867209.24519: done getting next task for host managed_node3 12033 1726867209.24522: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867209.24528: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867209.24537: getting variables 12033 1726867209.24539: in VariableManager get_vars() 12033 1726867209.24575: Calling all_inventory to load vars for managed_node3 12033 1726867209.24583: Calling groups_inventory to load vars for managed_node3 12033 1726867209.24586: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.24591: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a2f 12033 1726867209.24594: WORKER PROCESS EXITING 12033 1726867209.24604: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.24607: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.24610: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.25499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.27021: done with get_vars() 12033 1726867209.27037: done getting variables 12033 1726867209.27075: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:09 -0400 (0:00:00.049) 0:00:48.386 ****** 12033 1726867209.27113: entering _queue_task() for managed_node3/fail 12033 1726867209.27337: worker is 1 (out of 1 available) 12033 1726867209.27351: exiting _queue_task() for managed_node3/fail 12033 1726867209.27362: done queuing things up, now waiting for results queue to drain 12033 1726867209.27364: waiting for pending results... 12033 1726867209.27552: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867209.27664: in run() - task 0affcac9-a3a5-74bb-502b-000000000a30 12033 1726867209.27676: variable 'ansible_search_path' from source: unknown 12033 1726867209.27681: variable 'ansible_search_path' from source: unknown 12033 1726867209.27714: calling self._execute() 12033 1726867209.27784: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.27788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.27796: variable 'omit' from source: magic vars 12033 1726867209.28080: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.28090: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.28186: variable 'network_state' from source: role '' defaults 12033 1726867209.28197: Evaluated conditional (network_state != {}): False 12033 1726867209.28202: when evaluation is False, skipping this task 12033 1726867209.28205: _execute() done 12033 1726867209.28208: dumping result to json 12033 1726867209.28210: done dumping result, returning 12033 1726867209.28214: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network 
state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-74bb-502b-000000000a30] 12033 1726867209.28220: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a30 12033 1726867209.28314: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a30 12033 1726867209.28317: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867209.28391: no more pending results, returning what we have 12033 1726867209.28394: results queue empty 12033 1726867209.28395: checking for any_errors_fatal 12033 1726867209.28398: done checking for any_errors_fatal 12033 1726867209.28399: checking for max_fail_percentage 12033 1726867209.28403: done checking for max_fail_percentage 12033 1726867209.28404: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.28404: done checking to see if all hosts have failed 12033 1726867209.28405: getting the remaining hosts for this loop 12033 1726867209.28406: done getting the remaining hosts for this loop 12033 1726867209.28409: getting the next task for host managed_node3 12033 1726867209.28416: done getting next task for host managed_node3 12033 1726867209.28419: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867209.28424: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867209.28440: getting variables 12033 1726867209.28442: in VariableManager get_vars() 12033 1726867209.28475: Calling all_inventory to load vars for managed_node3 12033 1726867209.28479: Calling groups_inventory to load vars for managed_node3 12033 1726867209.28482: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.28489: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.28492: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.28494: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.29796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.30656: done with get_vars() 12033 1726867209.30670: done getting variables 12033 1726867209.30711: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration 
if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:09 -0400 (0:00:00.036) 0:00:48.423 ****** 12033 1726867209.30737: entering _queue_task() for managed_node3/fail 12033 1726867209.30922: worker is 1 (out of 1 available) 12033 1726867209.30934: exiting _queue_task() for managed_node3/fail 12033 1726867209.30946: done queuing things up, now waiting for results queue to drain 12033 1726867209.30948: waiting for pending results... 12033 1726867209.31134: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867209.31236: in run() - task 0affcac9-a3a5-74bb-502b-000000000a31 12033 1726867209.31248: variable 'ansible_search_path' from source: unknown 12033 1726867209.31251: variable 'ansible_search_path' from source: unknown 12033 1726867209.31286: calling self._execute() 12033 1726867209.31355: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.31359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.31367: variable 'omit' from source: magic vars 12033 1726867209.31648: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.31656: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.31741: variable 'network_state' from source: role '' defaults 12033 1726867209.31750: Evaluated conditional (network_state != {}): False 12033 1726867209.31753: when evaluation is False, skipping this task 12033 1726867209.31756: _execute() done 12033 1726867209.31758: dumping result to json 12033 1726867209.31761: done dumping result, returning 12033 1726867209.31768: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-74bb-502b-000000000a31] 12033 1726867209.31773: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a31 12033 1726867209.31860: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a31 12033 1726867209.31863: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867209.31910: no more pending results, returning what we have 12033 1726867209.31913: results queue empty 12033 1726867209.31914: checking for any_errors_fatal 12033 1726867209.31919: done checking for any_errors_fatal 12033 1726867209.31920: checking for max_fail_percentage 12033 1726867209.31922: done checking for max_fail_percentage 12033 1726867209.31922: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.31923: done checking to see if all hosts have failed 12033 1726867209.31924: getting the remaining hosts for this loop 12033 1726867209.31926: done getting the remaining hosts for this loop 12033 1726867209.31928: getting the next task for host managed_node3 12033 1726867209.31935: done getting next task for host managed_node3 12033 1726867209.31938: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867209.31943: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867209.31959: getting variables 12033 1726867209.31961: in VariableManager get_vars() 12033 1726867209.31995: Calling all_inventory to load vars for managed_node3 12033 1726867209.31998: Calling groups_inventory to load vars for managed_node3 12033 1726867209.32000: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.32008: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.32011: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.32013: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.32722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.33593: done with get_vars() 12033 1726867209.33608: done getting variables 12033 1726867209.33646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the 
system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:09 -0400 (0:00:00.029) 0:00:48.452 ****** 12033 1726867209.33671: entering _queue_task() for managed_node3/fail 12033 1726867209.33853: worker is 1 (out of 1 available) 12033 1726867209.33865: exiting _queue_task() for managed_node3/fail 12033 1726867209.33880: done queuing things up, now waiting for results queue to drain 12033 1726867209.33881: waiting for pending results... 12033 1726867209.34050: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867209.34159: in run() - task 0affcac9-a3a5-74bb-502b-000000000a32 12033 1726867209.34171: variable 'ansible_search_path' from source: unknown 12033 1726867209.34174: variable 'ansible_search_path' from source: unknown 12033 1726867209.34209: calling self._execute() 12033 1726867209.34273: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.34280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.34287: variable 'omit' from source: magic vars 12033 1726867209.34550: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.34561: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.34679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867209.36405: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867209.36444: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867209.36471: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867209.36497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867209.36522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867209.36575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.36599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.36621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.36647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.36658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.36731: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.36738: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12033 1726867209.36814: variable 'ansible_distribution' from source: facts 12033 1726867209.36818: variable '__network_rh_distros' from source: role '' defaults 12033 1726867209.36825: Evaluated conditional (ansible_distribution in __network_rh_distros): True 12033 1726867209.36982: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.37016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.37033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.37062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.37072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.37109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.37125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.37141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.37167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 
1726867209.37179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.37210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.37226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.37242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.37271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.37281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.37466: variable 'network_connections' from source: task vars 12033 1726867209.37473: variable 'controller_profile' from source: play vars 12033 1726867209.37524: variable 'controller_profile' from source: play vars 12033 1726867209.37532: variable 'controller_device' from source: play vars 12033 1726867209.37573: variable 'controller_device' from source: play vars 12033 1726867209.37583: variable 'dhcp_interface1' from source: play vars 12033 1726867209.37629: variable 'dhcp_interface1' from source: play vars 12033 1726867209.37636: variable 'port1_profile' from source: play vars 12033 1726867209.37679: variable 'port1_profile' 
from source: play vars 12033 1726867209.37685: variable 'dhcp_interface1' from source: play vars 12033 1726867209.37731: variable 'dhcp_interface1' from source: play vars 12033 1726867209.37736: variable 'controller_profile' from source: play vars 12033 1726867209.37776: variable 'controller_profile' from source: play vars 12033 1726867209.37784: variable 'port2_profile' from source: play vars 12033 1726867209.37830: variable 'port2_profile' from source: play vars 12033 1726867209.37836: variable 'dhcp_interface2' from source: play vars 12033 1726867209.37876: variable 'dhcp_interface2' from source: play vars 12033 1726867209.37889: variable 'controller_profile' from source: play vars 12033 1726867209.37931: variable 'controller_profile' from source: play vars 12033 1726867209.37939: variable 'network_state' from source: role '' defaults 12033 1726867209.37982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867209.38092: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867209.38121: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867209.38144: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867209.38165: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867209.38196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867209.38214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867209.38231: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.38251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867209.38281: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 12033 1726867209.38284: when evaluation is False, skipping this task 12033 1726867209.38287: _execute() done 12033 1726867209.38289: dumping result to json 12033 1726867209.38291: done dumping result, returning 12033 1726867209.38297: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-74bb-502b-000000000a32] 12033 1726867209.38303: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a32 12033 1726867209.38389: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a32 12033 1726867209.38392: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 12033 1726867209.38433: no more pending results, returning what we have 12033 1726867209.38437: results queue empty 12033 1726867209.38438: checking for any_errors_fatal 12033 1726867209.38442: done checking for any_errors_fatal 12033 
1726867209.38443: checking for max_fail_percentage 12033 1726867209.38446: done checking for max_fail_percentage 12033 1726867209.38447: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.38448: done checking to see if all hosts have failed 12033 1726867209.38448: getting the remaining hosts for this loop 12033 1726867209.38450: done getting the remaining hosts for this loop 12033 1726867209.38453: getting the next task for host managed_node3 12033 1726867209.38461: done getting next task for host managed_node3 12033 1726867209.38464: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867209.38469: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867209.38488: getting variables 12033 1726867209.38490: in VariableManager get_vars() 12033 1726867209.38526: Calling all_inventory to load vars for managed_node3 12033 1726867209.38529: Calling groups_inventory to load vars for managed_node3 12033 1726867209.38531: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.38539: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.38542: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.38544: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.39416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.40267: done with get_vars() 12033 1726867209.40283: done getting variables 12033 1726867209.40324: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:09 -0400 (0:00:00.066) 0:00:48.519 ****** 12033 1726867209.40347: entering _queue_task() for managed_node3/dnf 12033 1726867209.40553: worker is 1 (out of 1 available) 12033 1726867209.40567: exiting _queue_task() for managed_node3/dnf 12033 1726867209.40582: done queuing things up, now waiting for results queue to drain 12033 1726867209.40584: waiting for pending results... 
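The two "Abort applying the network state configuration ..." fail tasks above were both skipped for the same reason: their `when` clause tests `network_state != {}`, and the role default leaves `network_state` empty. A minimal Python sketch of that decision (a hypothetical helper, not the role's actual source):

```python
# Hedged sketch: why the logged fail tasks report
# "false_condition": "network_state != {}" and skip.
def should_run_abort_task(network_state: dict) -> bool:
    # Mirrors the logged line: Evaluated conditional (network_state != {})
    return network_state != {}

# Role default (empty dict) -> conditional False -> task skipped.
assert should_run_abort_task({}) is False
# If a play supplied any network_state, the fail task would execute instead.
assert should_run_abort_task({"interfaces": []}) is True
```

When the conditional is False, Ansible emits the `skipping: [managed_node3]` result seen in the log instead of running the `fail` action.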
12033 1726867209.40758: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867209.40872: in run() - task 0affcac9-a3a5-74bb-502b-000000000a33 12033 1726867209.40887: variable 'ansible_search_path' from source: unknown 12033 1726867209.40891: variable 'ansible_search_path' from source: unknown 12033 1726867209.40924: calling self._execute() 12033 1726867209.40992: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.40996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.41032: variable 'omit' from source: magic vars 12033 1726867209.41278: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.41287: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.41419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867209.42892: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867209.42934: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867209.42961: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867209.42988: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867209.43010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867209.43064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.43099: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.43119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.43144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.43155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.43237: variable 'ansible_distribution' from source: facts
12033 1726867209.43241: variable 'ansible_distribution_major_version' from source: facts
12033 1726867209.43254: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12033 1726867209.43329: variable '__network_wireless_connections_defined' from source: role '' defaults
12033 1726867209.43409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.43428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.43447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.43471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.43483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.43511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.43528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.43545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.43571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.43584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.43613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.43629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.43648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.43675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.43688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.43792: variable 'network_connections' from source: task vars
12033 1726867209.43802: variable 'controller_profile' from source: play vars
12033 1726867209.43846: variable 'controller_profile' from source: play vars
12033 1726867209.43854: variable 'controller_device' from source: play vars
12033 1726867209.43899: variable 'controller_device' from source: play vars
12033 1726867209.43909: variable 'dhcp_interface1' from source: play vars
12033 1726867209.43949: variable 'dhcp_interface1' from source: play vars
12033 1726867209.43959: variable 'port1_profile' from source: play vars
12033 1726867209.44005: variable 'port1_profile' from source: play vars
12033 1726867209.44012: variable 'dhcp_interface1' from source: play vars
12033 1726867209.44054: variable 'dhcp_interface1' from source: play vars
12033 1726867209.44059: variable 'controller_profile' from source: play vars
12033 1726867209.44106: variable 'controller_profile' from source: play vars
12033 1726867209.44113: variable 'port2_profile' from source: play vars
12033 1726867209.44153: variable 'port2_profile' from source: play vars
12033 1726867209.44159: variable 'dhcp_interface2' from source: play vars
12033 1726867209.44207: variable 'dhcp_interface2' from source: play vars
12033 1726867209.44217: variable 'controller_profile' from source: play vars
12033 1726867209.44255: variable 'controller_profile' from source: play vars
12033 1726867209.44304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867209.44415: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867209.44444: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867209.44467: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867209.44490: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867209.44535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12033 1726867209.44551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12033 1726867209.44568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.44588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12033 1726867209.44636: variable '__network_team_connections_defined' from source: role '' defaults
12033 1726867209.44809: variable 'network_connections' from source: task vars
12033 1726867209.44813: variable 'controller_profile' from source: play vars
12033 1726867209.44858: variable 'controller_profile' from source: play vars
12033 1726867209.44864: variable 'controller_device' from source: play vars
12033 1726867209.44910: variable 'controller_device' from source: play vars
12033 1726867209.44917: variable 'dhcp_interface1' from source: play vars
12033 1726867209.44961: variable 'dhcp_interface1' from source: play vars
12033 1726867209.44968: variable 'port1_profile' from source: play vars
12033 1726867209.45013: variable 'port1_profile' from source: play vars
12033 1726867209.45019: variable 'dhcp_interface1' from source: play vars
12033 1726867209.45060: variable 'dhcp_interface1' from source: play vars
12033 1726867209.45065: variable 'controller_profile' from source: play vars
12033 1726867209.45111: variable 'controller_profile' from source: play vars
12033 1726867209.45117: variable 'port2_profile' from source: play vars
12033 1726867209.45160: variable 'port2_profile' from source: play vars
12033 1726867209.45163: variable 'dhcp_interface2' from source: play vars
12033 1726867209.45209: variable 'dhcp_interface2' from source: play vars
12033 1726867209.45214: variable 'controller_profile' from source: play vars
12033 1726867209.45256: variable 'controller_profile' from source: play vars
12033 1726867209.45281: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12033 1726867209.45284: when evaluation is False, skipping this task
12033 1726867209.45287: _execute() done
12033 1726867209.45289: dumping result to json
12033 1726867209.45292: done dumping result, returning
12033 1726867209.45298: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000a33]
12033 1726867209.45305: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a33
12033 1726867209.45388: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a33
12033 1726867209.45390: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12033 1726867209.45439: no more pending results, returning what we have
12033 1726867209.45443: results queue empty
12033 1726867209.45444: checking for any_errors_fatal
12033 1726867209.45452: done checking for any_errors_fatal
12033 1726867209.45452: checking for max_fail_percentage
12033 1726867209.45454: done checking for max_fail_percentage
12033 1726867209.45455: checking to see if all hosts have failed and the running result is not ok
12033 1726867209.45456: done checking to see if all hosts have failed
12033 1726867209.45457: getting the remaining hosts for this loop
12033 1726867209.45459: done getting the remaining hosts for this loop
12033 1726867209.45462: getting the next task for host managed_node3
12033 1726867209.45469: done getting next task for host managed_node3
12033 1726867209.45472: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12033 1726867209.45478: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867209.45496: getting variables
12033 1726867209.45497: in VariableManager get_vars()
12033 1726867209.45535: Calling all_inventory to load vars for managed_node3
12033 1726867209.45537: Calling groups_inventory to load vars for managed_node3
12033 1726867209.45539: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867209.45547: Calling all_plugins_play to load vars for managed_node3
12033 1726867209.45550: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867209.45552: Calling groups_plugins_play to load vars for managed_node3
12033 1726867209.46534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867209.47645: done with get_vars()
12033 1726867209.47660: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12033 1726867209.47711: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 17:20:09 -0400 (0:00:00.073) 0:00:48.593 ******
12033 1726867209.47733: entering _queue_task() for managed_node3/yum
12033 1726867209.47930: worker is 1 (out of 1 available)
12033 1726867209.47944: exiting _queue_task() for managed_node3/yum
12033 1726867209.47955: done queuing things up, now waiting for results queue to drain
12033 1726867209.47957: waiting for pending results...
12033 1726867209.48144: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12033 1726867209.48248: in run() - task 0affcac9-a3a5-74bb-502b-000000000a34
12033 1726867209.48259: variable 'ansible_search_path' from source: unknown
12033 1726867209.48263: variable 'ansible_search_path' from source: unknown
12033 1726867209.48297: calling self._execute()
12033 1726867209.48364: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867209.48367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867209.48375: variable 'omit' from source: magic vars
12033 1726867209.48645: variable 'ansible_distribution_major_version' from source: facts
12033 1726867209.48655: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867209.48772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867209.51383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867209.51386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867209.51389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867209.51421: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867209.51451: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867209.51530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.51566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.51607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.51651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.51670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.51764: variable 'ansible_distribution_major_version' from source: facts
12033 1726867209.51788: Evaluated conditional (ansible_distribution_major_version | int < 8): False
12033 1726867209.51795: when evaluation is False, skipping this task
12033 1726867209.51804: _execute() done
12033 1726867209.51811: dumping result to json
12033 1726867209.51817: done dumping result, returning
12033 1726867209.51828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000a34]
12033 1726867209.51836: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a34
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
12033 1726867209.51997: no more pending results, returning what we have
12033 1726867209.52001: results queue empty
12033 1726867209.52002: checking for any_errors_fatal
12033 1726867209.52009: done checking for any_errors_fatal
12033 1726867209.52009: checking for max_fail_percentage
12033 1726867209.52011: done checking for max_fail_percentage
12033 1726867209.52012: checking to see if all hosts have failed and the running result is not ok
12033 1726867209.52013: done checking to see if all hosts have failed
12033 1726867209.52013: getting the remaining hosts for this loop
12033 1726867209.52015: done getting the remaining hosts for this loop
12033 1726867209.52018: getting the next task for host managed_node3
12033 1726867209.52025: done getting next task for host managed_node3
12033 1726867209.52028: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12033 1726867209.52032: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867209.52050: getting variables
12033 1726867209.52052: in VariableManager get_vars()
12033 1726867209.52202: Calling all_inventory to load vars for managed_node3
12033 1726867209.52205: Calling groups_inventory to load vars for managed_node3
12033 1726867209.52207: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867209.52216: Calling all_plugins_play to load vars for managed_node3
12033 1726867209.52219: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867209.52222: Calling groups_plugins_play to load vars for managed_node3
12033 1726867209.52740: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a34
12033 1726867209.52743: WORKER PROCESS EXITING
12033 1726867209.53618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867209.55225: done with get_vars()
12033 1726867209.55246: done getting variables
12033 1726867209.55302: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 17:20:09 -0400 (0:00:00.075) 0:00:48.669 ******
12033 1726867209.55336: entering _queue_task() for managed_node3/fail
12033 1726867209.55797: worker is 1 (out of 1 available)
12033 1726867209.55811: exiting _queue_task() for managed_node3/fail
12033 1726867209.55820: done queuing things up, now waiting for results queue to drain
12033 1726867209.55822: waiting for pending results...
12033 1726867209.55952: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12033 1726867209.56092: in run() - task 0affcac9-a3a5-74bb-502b-000000000a35
12033 1726867209.56156: variable 'ansible_search_path' from source: unknown
12033 1726867209.56159: variable 'ansible_search_path' from source: unknown
12033 1726867209.56162: calling self._execute()
12033 1726867209.56246: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867209.56257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867209.56275: variable 'omit' from source: magic vars
12033 1726867209.56684: variable 'ansible_distribution_major_version' from source: facts
12033 1726867209.56708: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867209.56840: variable '__network_wireless_connections_defined' from source: role '' defaults
12033 1726867209.57094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867209.60086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867209.60152: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867209.60197: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867209.60241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867209.60276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867209.60470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.60474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.60479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.60493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.60516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.60564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.60597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.60628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.60672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.60695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.60741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.60768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.60806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.60848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.60866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.61047: variable 'network_connections' from source: task vars
12033 1726867209.61064: variable 'controller_profile' from source: play vars
12033 1726867209.61141: variable 'controller_profile' from source: play vars
12033 1726867209.61156: variable 'controller_device' from source: play vars
12033 1726867209.61226: variable 'controller_device' from source: play vars
12033 1726867209.61240: variable 'dhcp_interface1' from source: play vars
12033 1726867209.61305: variable 'dhcp_interface1' from source: play vars
12033 1726867209.61334: variable 'port1_profile' from source: play vars
12033 1726867209.61385: variable 'port1_profile' from source: play vars
12033 1726867209.61443: variable 'dhcp_interface1' from source: play vars
12033 1726867209.61465: variable 'dhcp_interface1' from source: play vars
12033 1726867209.61475: variable 'controller_profile' from source: play vars
12033 1726867209.61540: variable 'controller_profile' from source: play vars
12033 1726867209.61556: variable 'port2_profile' from source: play vars
12033 1726867209.61620: variable 'port2_profile' from source: play vars
12033 1726867209.61631: variable 'dhcp_interface2' from source: play vars
12033 1726867209.61698: variable 'dhcp_interface2' from source: play vars
12033 1726867209.61768: variable 'controller_profile' from source: play vars
12033 1726867209.61779: variable 'controller_profile' from source: play vars
12033 1726867209.61853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867209.62311: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867209.62356: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867209.62396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867209.62515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867209.62564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12033 1726867209.62658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12033 1726867209.62692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.62771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12033 1726867209.63068: variable '__network_team_connections_defined' from source: role '' defaults
12033 1726867209.63565: variable 'network_connections' from source: task vars
12033 1726867209.63568: variable 'controller_profile' from source: play vars
12033 1726867209.63631: variable 'controller_profile' from source: play vars
12033 1726867209.63782: variable 'controller_device' from source: play vars
12033 1726867209.63786: variable 'controller_device' from source: play vars
12033 1726867209.63788: variable 'dhcp_interface1' from source: play vars
12033 1726867209.63993: variable 'dhcp_interface1' from source: play vars
12033 1726867209.64002: variable 'port1_profile' from source: play vars
12033 1726867209.64067: variable 'port1_profile' from source: play vars
12033 1726867209.64139: variable 'dhcp_interface1' from source: play vars
12033 1726867209.64309: variable 'dhcp_interface1' from source: play vars
12033 1726867209.64322: variable 'controller_profile' from source: play vars
12033 1726867209.64449: variable 'controller_profile' from source: play vars
12033 1726867209.64716: variable 'port2_profile' from source: play vars
12033 1726867209.64718: variable 'port2_profile' from source: play vars
12033 1726867209.64721: variable 'dhcp_interface2' from source: play vars
12033 1726867209.64899: variable 'dhcp_interface2' from source: play vars
12033 1726867209.64905: variable 'controller_profile' from source: play vars
12033 1726867209.64964: variable 'controller_profile' from source: play vars
12033 1726867209.65117: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12033 1726867209.65120: when evaluation is False, skipping this task
12033 1726867209.65123: _execute() done
12033 1726867209.65125: dumping result to json
12033 1726867209.65128: done dumping result, returning
12033 1726867209.65130: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000a35]
12033 1726867209.65132: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a35
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12033 1726867209.65410: no more pending results, returning what we have
12033 1726867209.65414: results queue empty
12033 1726867209.65415: checking for any_errors_fatal
12033 1726867209.65422: done checking for any_errors_fatal
12033 1726867209.65423: checking for max_fail_percentage
12033 1726867209.65425: done checking for max_fail_percentage
12033 1726867209.65426: checking to see if all hosts have failed and the running result is not ok
12033 1726867209.65426: done checking to see if all hosts have failed
12033 1726867209.65427: getting the remaining hosts for this loop
12033 1726867209.65429: done getting the remaining hosts for this loop
12033 1726867209.65432: getting the next task for host managed_node3
12033 1726867209.65441: done getting next task for host managed_node3
12033 1726867209.65445: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
12033 1726867209.65450: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12033 1726867209.65470: getting variables
12033 1726867209.65471: in VariableManager get_vars()
12033 1726867209.65626: Calling all_inventory to load vars for managed_node3
12033 1726867209.65629: Calling groups_inventory to load vars for managed_node3
12033 1726867209.65632: Calling all_plugins_inventory to load vars for managed_node3
12033 1726867209.65909: Calling all_plugins_play to load vars for managed_node3
12033 1726867209.65913: Calling groups_plugins_inventory to load vars for managed_node3
12033 1726867209.65916: Calling groups_plugins_play to load vars for managed_node3
12033 1726867209.66590: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a35
12033 1726867209.66594: WORKER PROCESS EXITING
12033 1726867209.67192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12033 1726867209.69627: done with get_vars()
12033 1726867209.69649: done getting variables
12033 1726867209.69710: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 17:20:09 -0400 (0:00:00.144) 0:00:48.813 ******
12033 1726867209.69746: entering _queue_task() for managed_node3/package
12033 1726867209.70058: worker is 1 (out of 1 available)
12033 1726867209.70070: exiting _queue_task() for managed_node3/package
12033 1726867209.70084: done queuing things up, now waiting for results queue to drain
12033 1726867209.70085: waiting for pending results...
12033 1726867209.70334: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
12033 1726867209.70502: in run() - task 0affcac9-a3a5-74bb-502b-000000000a36
12033 1726867209.70525: variable 'ansible_search_path' from source: unknown
12033 1726867209.70534: variable 'ansible_search_path' from source: unknown
12033 1726867209.70575: calling self._execute()
12033 1726867209.70689: variable 'ansible_host' from source: host vars for 'managed_node3'
12033 1726867209.70702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
12033 1726867209.70716: variable 'omit' from source: magic vars
12033 1726867209.71103: variable 'ansible_distribution_major_version' from source: facts
12033 1726867209.71124: Evaluated conditional (ansible_distribution_major_version != '6'): True
12033 1726867209.71335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12033 1726867209.71710: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12033 1726867209.71763: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12033 1726867209.71818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12033 1726867209.71928: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12033 1726867209.72046: variable 'network_packages' from source: role '' defaults
12033 1726867209.72254: variable '__network_provider_setup' from source: role '' defaults
12033 1726867209.72257: variable '__network_service_name_default_nm' from source: role '' defaults
12033 1726867209.72260: variable '__network_service_name_default_nm' from source: role '' defaults
12033 1726867209.72262: variable '__network_packages_default_nm' from source: role '' defaults
12033 1726867209.72324: variable '__network_packages_default_nm' from source: role '' defaults
12033 1726867209.72522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12033 1726867209.74985: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12033 1726867209.74988: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12033 1726867209.74991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12033 1726867209.74993: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12033 1726867209.74995: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12033 1726867209.75145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.75185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.75287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.75339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.75453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033 1726867209.75507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12033 1726867209.75535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12033 1726867209.75762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12033 1726867209.75765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12033 1726867209.75768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12033
1726867209.76039: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867209.76164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.76204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.76236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.76279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.76304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.76395: variable 'ansible_python' from source: facts 12033 1726867209.76425: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867209.76513: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867209.76595: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867209.76739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.76768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.76802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.76851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.76871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.76924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867209.76966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867209.77067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867209.77070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867209.77072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867209.77213: variable 'network_connections' from source: task vars 
12033 1726867209.77225: variable 'controller_profile' from source: play vars 12033 1726867209.77333: variable 'controller_profile' from source: play vars 12033 1726867209.77349: variable 'controller_device' from source: play vars 12033 1726867209.77458: variable 'controller_device' from source: play vars 12033 1726867209.77474: variable 'dhcp_interface1' from source: play vars 12033 1726867209.77571: variable 'dhcp_interface1' from source: play vars 12033 1726867209.77598: variable 'port1_profile' from source: play vars 12033 1726867209.77762: variable 'port1_profile' from source: play vars 12033 1726867209.77776: variable 'dhcp_interface1' from source: play vars 12033 1726867209.77935: variable 'dhcp_interface1' from source: play vars 12033 1726867209.77938: variable 'controller_profile' from source: play vars 12033 1726867209.78007: variable 'controller_profile' from source: play vars 12033 1726867209.78023: variable 'port2_profile' from source: play vars 12033 1726867209.78262: variable 'port2_profile' from source: play vars 12033 1726867209.78279: variable 'dhcp_interface2' from source: play vars 12033 1726867209.78586: variable 'dhcp_interface2' from source: play vars 12033 1726867209.78694: variable 'controller_profile' from source: play vars 12033 1726867209.78793: variable 'controller_profile' from source: play vars 12033 1726867209.78874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867209.78940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867209.79237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 
12033 1726867209.79241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867209.79243: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867209.79618: variable 'network_connections' from source: task vars 12033 1726867209.79629: variable 'controller_profile' from source: play vars 12033 1726867209.79735: variable 'controller_profile' from source: play vars 12033 1726867209.79750: variable 'controller_device' from source: play vars 12033 1726867209.79857: variable 'controller_device' from source: play vars 12033 1726867209.79874: variable 'dhcp_interface1' from source: play vars 12033 1726867209.79955: variable 'dhcp_interface1' from source: play vars 12033 1726867209.79970: variable 'port1_profile' from source: play vars 12033 1726867209.80076: variable 'port1_profile' from source: play vars 12033 1726867209.80098: variable 'dhcp_interface1' from source: play vars 12033 1726867209.80204: variable 'dhcp_interface1' from source: play vars 12033 1726867209.80223: variable 'controller_profile' from source: play vars 12033 1726867209.80328: variable 'controller_profile' from source: play vars 12033 1726867209.80343: variable 'port2_profile' from source: play vars 12033 1726867209.80447: variable 'port2_profile' from source: play vars 12033 1726867209.80462: variable 'dhcp_interface2' from source: play vars 12033 1726867209.80571: variable 'dhcp_interface2' from source: play vars 12033 1726867209.80592: variable 'controller_profile' from source: play vars 12033 1726867209.80699: variable 'controller_profile' from source: play vars 12033 1726867209.80764: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867209.80852: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867209.81193: variable 'network_connections' 
from source: task vars 12033 1726867209.81207: variable 'controller_profile' from source: play vars 12033 1726867209.81272: variable 'controller_profile' from source: play vars 12033 1726867209.81287: variable 'controller_device' from source: play vars 12033 1726867209.81360: variable 'controller_device' from source: play vars 12033 1726867209.81380: variable 'dhcp_interface1' from source: play vars 12033 1726867209.81461: variable 'dhcp_interface1' from source: play vars 12033 1726867209.81489: variable 'port1_profile' from source: play vars 12033 1726867209.81596: variable 'port1_profile' from source: play vars 12033 1726867209.81611: variable 'dhcp_interface1' from source: play vars 12033 1726867209.81673: variable 'dhcp_interface1' from source: play vars 12033 1726867209.81688: variable 'controller_profile' from source: play vars 12033 1726867209.81760: variable 'controller_profile' from source: play vars 12033 1726867209.81774: variable 'port2_profile' from source: play vars 12033 1726867209.81847: variable 'port2_profile' from source: play vars 12033 1726867209.81859: variable 'dhcp_interface2' from source: play vars 12033 1726867209.81932: variable 'dhcp_interface2' from source: play vars 12033 1726867209.81984: variable 'controller_profile' from source: play vars 12033 1726867209.82014: variable 'controller_profile' from source: play vars 12033 1726867209.82046: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867209.82129: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867209.82459: variable 'network_connections' from source: task vars 12033 1726867209.82472: variable 'controller_profile' from source: play vars 12033 1726867209.82576: variable 'controller_profile' from source: play vars 12033 1726867209.82582: variable 'controller_device' from source: play vars 12033 1726867209.82629: variable 'controller_device' from source: play vars 12033 1726867209.82642: variable 
'dhcp_interface1' from source: play vars 12033 1726867209.82716: variable 'dhcp_interface1' from source: play vars 12033 1726867209.82743: variable 'port1_profile' from source: play vars 12033 1726867209.82817: variable 'port1_profile' from source: play vars 12033 1726867209.82903: variable 'dhcp_interface1' from source: play vars 12033 1726867209.82906: variable 'dhcp_interface1' from source: play vars 12033 1726867209.82908: variable 'controller_profile' from source: play vars 12033 1726867209.82968: variable 'controller_profile' from source: play vars 12033 1726867209.83019: variable 'port2_profile' from source: play vars 12033 1726867209.83227: variable 'port2_profile' from source: play vars 12033 1726867209.83231: variable 'dhcp_interface2' from source: play vars 12033 1726867209.83315: variable 'dhcp_interface2' from source: play vars 12033 1726867209.83345: variable 'controller_profile' from source: play vars 12033 1726867209.83493: variable 'controller_profile' from source: play vars 12033 1726867209.83772: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867209.83775: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867209.83784: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867209.83911: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867209.84318: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867209.84846: variable 'network_connections' from source: task vars 12033 1726867209.84861: variable 'controller_profile' from source: play vars 12033 1726867209.84924: variable 'controller_profile' from source: play vars 12033 1726867209.84935: variable 'controller_device' from source: play vars 12033 1726867209.85004: variable 'controller_device' from source: play vars 12033 1726867209.85018: variable 'dhcp_interface1' from source: play vars 
12033 1726867209.85086: variable 'dhcp_interface1' from source: play vars 12033 1726867209.85099: variable 'port1_profile' from source: play vars 12033 1726867209.85159: variable 'port1_profile' from source: play vars 12033 1726867209.85172: variable 'dhcp_interface1' from source: play vars 12033 1726867209.85236: variable 'dhcp_interface1' from source: play vars 12033 1726867209.85249: variable 'controller_profile' from source: play vars 12033 1726867209.85317: variable 'controller_profile' from source: play vars 12033 1726867209.85400: variable 'port2_profile' from source: play vars 12033 1726867209.85403: variable 'port2_profile' from source: play vars 12033 1726867209.85406: variable 'dhcp_interface2' from source: play vars 12033 1726867209.85462: variable 'dhcp_interface2' from source: play vars 12033 1726867209.85474: variable 'controller_profile' from source: play vars 12033 1726867209.85541: variable 'controller_profile' from source: play vars 12033 1726867209.85554: variable 'ansible_distribution' from source: facts 12033 1726867209.85563: variable '__network_rh_distros' from source: role '' defaults 12033 1726867209.85573: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.85605: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867209.85770: variable 'ansible_distribution' from source: facts 12033 1726867209.85781: variable '__network_rh_distros' from source: role '' defaults 12033 1726867209.85792: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.85809: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867209.85987: variable 'ansible_distribution' from source: facts 12033 1726867209.86054: variable '__network_rh_distros' from source: role '' defaults 12033 1726867209.86057: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.86060: variable 
'network_provider' from source: set_fact 12033 1726867209.86072: variable 'ansible_facts' from source: unknown 12033 1726867209.86796: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12033 1726867209.86808: when evaluation is False, skipping this task 12033 1726867209.86815: _execute() done 12033 1726867209.86821: dumping result to json 12033 1726867209.86832: done dumping result, returning 12033 1726867209.86844: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-74bb-502b-000000000a36] 12033 1726867209.86859: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a36 12033 1726867209.87143: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a36 12033 1726867209.87146: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12033 1726867209.87202: no more pending results, returning what we have 12033 1726867209.87207: results queue empty 12033 1726867209.87208: checking for any_errors_fatal 12033 1726867209.87214: done checking for any_errors_fatal 12033 1726867209.87215: checking for max_fail_percentage 12033 1726867209.87217: done checking for max_fail_percentage 12033 1726867209.87218: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.87219: done checking to see if all hosts have failed 12033 1726867209.87219: getting the remaining hosts for this loop 12033 1726867209.87221: done getting the remaining hosts for this loop 12033 1726867209.87225: getting the next task for host managed_node3 12033 1726867209.87233: done getting next task for host managed_node3 12033 1726867209.87236: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867209.87241: ^ state is: HOST 
STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867209.87261: getting variables 12033 1726867209.87263: in VariableManager get_vars() 12033 1726867209.87508: Calling all_inventory to load vars for managed_node3 12033 1726867209.87511: Calling groups_inventory to load vars for managed_node3 12033 1726867209.87513: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.87522: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.87525: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.87528: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.88800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.91154: done with get_vars() 12033 1726867209.91173: done getting variables 12033 1726867209.91218: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:09 -0400 (0:00:00.214) 0:00:49.028 ****** 12033 1726867209.91243: entering _queue_task() for managed_node3/package 12033 1726867209.91476: worker is 1 (out of 1 available) 12033 1726867209.91491: exiting _queue_task() for managed_node3/package 12033 1726867209.91503: done queuing things up, now waiting for results queue to drain 12033 1726867209.91505: waiting for pending results... 
12033 1726867209.91687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867209.91797: in run() - task 0affcac9-a3a5-74bb-502b-000000000a37 12033 1726867209.91811: variable 'ansible_search_path' from source: unknown 12033 1726867209.91815: variable 'ansible_search_path' from source: unknown 12033 1726867209.91845: calling self._execute() 12033 1726867209.91917: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.91921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.91930: variable 'omit' from source: magic vars 12033 1726867209.92205: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.92213: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.92295: variable 'network_state' from source: role '' defaults 12033 1726867209.92305: Evaluated conditional (network_state != {}): False 12033 1726867209.92308: when evaluation is False, skipping this task 12033 1726867209.92311: _execute() done 12033 1726867209.92314: dumping result to json 12033 1726867209.92316: done dumping result, returning 12033 1726867209.92322: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000a37] 12033 1726867209.92327: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a37 12033 1726867209.92421: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a37 12033 1726867209.92424: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867209.92468: no more pending results, returning what we have 12033 1726867209.92472: results queue empty 12033 1726867209.92473: checking 
for any_errors_fatal 12033 1726867209.92480: done checking for any_errors_fatal 12033 1726867209.92481: checking for max_fail_percentage 12033 1726867209.92483: done checking for max_fail_percentage 12033 1726867209.92484: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.92485: done checking to see if all hosts have failed 12033 1726867209.92485: getting the remaining hosts for this loop 12033 1726867209.92487: done getting the remaining hosts for this loop 12033 1726867209.92490: getting the next task for host managed_node3 12033 1726867209.92497: done getting next task for host managed_node3 12033 1726867209.92502: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867209.92508: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867209.92526: getting variables 12033 1726867209.92528: in VariableManager get_vars() 12033 1726867209.92722: Calling all_inventory to load vars for managed_node3 12033 1726867209.92725: Calling groups_inventory to load vars for managed_node3 12033 1726867209.92728: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.92736: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.92738: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.92741: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.94850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867209.96682: done with get_vars() 12033 1726867209.96828: done getting variables 12033 1726867209.97010: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:09 -0400 (0:00:00.058) 0:00:49.086 ****** 12033 1726867209.97049: entering _queue_task() for managed_node3/package 12033 1726867209.97404: worker is 1 (out of 1 available) 12033 1726867209.97417: exiting _queue_task() for managed_node3/package 12033 1726867209.97429: done queuing things up, now waiting for results queue to drain 12033 1726867209.97431: waiting for pending results... 
12033 1726867209.97904: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867209.97922: in run() - task 0affcac9-a3a5-74bb-502b-000000000a38 12033 1726867209.97944: variable 'ansible_search_path' from source: unknown 12033 1726867209.97954: variable 'ansible_search_path' from source: unknown 12033 1726867209.98011: calling self._execute() 12033 1726867209.98118: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867209.98129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867209.98181: variable 'omit' from source: magic vars 12033 1726867209.98729: variable 'ansible_distribution_major_version' from source: facts 12033 1726867209.98784: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867209.98887: variable 'network_state' from source: role '' defaults 12033 1726867209.98909: Evaluated conditional (network_state != {}): False 12033 1726867209.98921: when evaluation is False, skipping this task 12033 1726867209.98931: _execute() done 12033 1726867209.98940: dumping result to json 12033 1726867209.98984: done dumping result, returning 12033 1726867209.98988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000a38] 12033 1726867209.98990: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a38 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867209.99137: no more pending results, returning what we have 12033 1726867209.99141: results queue empty 12033 1726867209.99142: checking for any_errors_fatal 12033 1726867209.99148: done checking for any_errors_fatal 12033 1726867209.99149: checking for max_fail_percentage 12033 
1726867209.99151: done checking for max_fail_percentage 12033 1726867209.99152: checking to see if all hosts have failed and the running result is not ok 12033 1726867209.99153: done checking to see if all hosts have failed 12033 1726867209.99153: getting the remaining hosts for this loop 12033 1726867209.99155: done getting the remaining hosts for this loop 12033 1726867209.99158: getting the next task for host managed_node3 12033 1726867209.99165: done getting next task for host managed_node3 12033 1726867209.99169: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867209.99175: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867209.99205: getting variables 12033 1726867209.99206: in VariableManager get_vars() 12033 1726867209.99245: Calling all_inventory to load vars for managed_node3 12033 1726867209.99251: Calling groups_inventory to load vars for managed_node3 12033 1726867209.99253: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867209.99263: Calling all_plugins_play to load vars for managed_node3 12033 1726867209.99266: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867209.99268: Calling groups_plugins_play to load vars for managed_node3 12033 1726867209.99793: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a38 12033 1726867209.99797: WORKER PROCESS EXITING 12033 1726867210.00404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.01264: done with get_vars() 12033 1726867210.01280: done getting variables 12033 1726867210.01320: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:10 -0400 (0:00:00.042) 0:00:49.129 ****** 12033 1726867210.01346: entering _queue_task() for managed_node3/service 12033 1726867210.01553: worker is 1 (out of 1 available) 12033 1726867210.01566: exiting _queue_task() for managed_node3/service 12033 1726867210.01579: done queuing things up, now waiting for results queue to drain 12033 1726867210.01581: waiting for pending results... 
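(The trace above queues the "Restart NetworkManager due to wireless or team interfaces" task via the `service` action plugin, and the result below shows it skipped on the conditional `__network_wireless_connections_defined or __network_team_connections_defined`. A minimal, hypothetical sketch of such a conditionally gated task — not the role's actual source, which lives at the `tasks/main.yml:109` path logged below — might look like:)

```yaml
# Hypothetical reconstruction from the log; variable names and the
# conditional string are taken verbatim from the trace, the rest is assumed.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Skipped in this run: both flags evaluated to a False result
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

(When the `when:` expression is false, the executor short-circuits before the module runs, which is why the log records "when evaluation is False, skipping this task" and returns a result with `"changed": false` and a `skip_reason`.)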
12033 1726867210.01752: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867210.01851: in run() - task 0affcac9-a3a5-74bb-502b-000000000a39 12033 1726867210.01862: variable 'ansible_search_path' from source: unknown 12033 1726867210.01866: variable 'ansible_search_path' from source: unknown 12033 1726867210.01897: calling self._execute() 12033 1726867210.01975: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.01981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.01988: variable 'omit' from source: magic vars 12033 1726867210.02255: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.02265: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.02347: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867210.02479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867210.04146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867210.04187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867210.04217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867210.04242: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867210.04260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867210.04322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12033 1726867210.04344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.04361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.04388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.04399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.04435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.04452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.04468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.04495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.04507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.04538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.04554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.04570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.04595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.04607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.04713: variable 'network_connections' from source: task vars 12033 1726867210.04723: variable 'controller_profile' from source: play vars 12033 1726867210.04769: variable 'controller_profile' from source: play vars 12033 1726867210.04780: variable 'controller_device' from source: play vars 12033 1726867210.04822: variable 'controller_device' from source: play vars 12033 1726867210.04830: variable 'dhcp_interface1' from source: play vars 12033 1726867210.04874: variable 'dhcp_interface1' from source: play vars 12033 1726867210.04883: variable 'port1_profile' from source: play vars 12033 1726867210.04925: variable 'port1_profile' from source: play vars 12033 1726867210.04931: variable 'dhcp_interface1' from source: play vars 12033 
1726867210.04975: variable 'dhcp_interface1' from source: play vars 12033 1726867210.04982: variable 'controller_profile' from source: play vars 12033 1726867210.05023: variable 'controller_profile' from source: play vars 12033 1726867210.05029: variable 'port2_profile' from source: play vars 12033 1726867210.05075: variable 'port2_profile' from source: play vars 12033 1726867210.05080: variable 'dhcp_interface2' from source: play vars 12033 1726867210.05121: variable 'dhcp_interface2' from source: play vars 12033 1726867210.05125: variable 'controller_profile' from source: play vars 12033 1726867210.05180: variable 'controller_profile' from source: play vars 12033 1726867210.05226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867210.05337: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867210.05363: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867210.05387: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867210.05411: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867210.05441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867210.05455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867210.05472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.05496: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867210.05545: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867210.05691: variable 'network_connections' from source: task vars 12033 1726867210.05695: variable 'controller_profile' from source: play vars 12033 1726867210.05739: variable 'controller_profile' from source: play vars 12033 1726867210.05746: variable 'controller_device' from source: play vars 12033 1726867210.05788: variable 'controller_device' from source: play vars 12033 1726867210.05794: variable 'dhcp_interface1' from source: play vars 12033 1726867210.05835: variable 'dhcp_interface1' from source: play vars 12033 1726867210.05846: variable 'port1_profile' from source: play vars 12033 1726867210.05886: variable 'port1_profile' from source: play vars 12033 1726867210.05891: variable 'dhcp_interface1' from source: play vars 12033 1726867210.05933: variable 'dhcp_interface1' from source: play vars 12033 1726867210.05938: variable 'controller_profile' from source: play vars 12033 1726867210.05983: variable 'controller_profile' from source: play vars 12033 1726867210.05989: variable 'port2_profile' from source: play vars 12033 1726867210.06030: variable 'port2_profile' from source: play vars 12033 1726867210.06036: variable 'dhcp_interface2' from source: play vars 12033 1726867210.06081: variable 'dhcp_interface2' from source: play vars 12033 1726867210.06086: variable 'controller_profile' from source: play vars 12033 1726867210.06127: variable 'controller_profile' from source: play vars 12033 1726867210.06150: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867210.06153: when evaluation is False, skipping this task 12033 1726867210.06155: _execute() done 12033 1726867210.06158: dumping result to json 12033 
1726867210.06160: done dumping result, returning 12033 1726867210.06171: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000a39] 12033 1726867210.06175: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a39 12033 1726867210.06257: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a39 12033 1726867210.06260: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867210.06319: no more pending results, returning what we have 12033 1726867210.06322: results queue empty 12033 1726867210.06323: checking for any_errors_fatal 12033 1726867210.06328: done checking for any_errors_fatal 12033 1726867210.06329: checking for max_fail_percentage 12033 1726867210.06331: done checking for max_fail_percentage 12033 1726867210.06332: checking to see if all hosts have failed and the running result is not ok 12033 1726867210.06332: done checking to see if all hosts have failed 12033 1726867210.06333: getting the remaining hosts for this loop 12033 1726867210.06335: done getting the remaining hosts for this loop 12033 1726867210.06338: getting the next task for host managed_node3 12033 1726867210.06345: done getting next task for host managed_node3 12033 1726867210.06348: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867210.06353: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867210.06370: getting variables 12033 1726867210.06372: in VariableManager get_vars() 12033 1726867210.06412: Calling all_inventory to load vars for managed_node3 12033 1726867210.06415: Calling groups_inventory to load vars for managed_node3 12033 1726867210.06417: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867210.06426: Calling all_plugins_play to load vars for managed_node3 12033 1726867210.06428: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867210.06431: Calling groups_plugins_play to load vars for managed_node3 12033 1726867210.07309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.08172: done with get_vars() 12033 1726867210.08188: done getting variables 12033 1726867210.08232: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:10 -0400 (0:00:00.069) 0:00:49.198 ****** 12033 1726867210.08254: entering _queue_task() for managed_node3/service 12033 1726867210.08473: worker is 1 (out of 1 available) 12033 1726867210.08488: exiting _queue_task() for managed_node3/service 12033 1726867210.08503: done queuing things up, now waiting for results queue to drain 12033 1726867210.08505: waiting for pending results... 12033 1726867210.08675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867210.08771: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3a 12033 1726867210.08792: variable 'ansible_search_path' from source: unknown 12033 1726867210.08796: variable 'ansible_search_path' from source: unknown 12033 1726867210.08827: calling self._execute() 12033 1726867210.08902: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.08909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.08918: variable 'omit' from source: magic vars 12033 1726867210.09185: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.09195: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.09307: variable 'network_provider' from source: set_fact 12033 1726867210.09311: variable 'network_state' from source: role '' defaults 12033 1726867210.09321: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12033 1726867210.09327: variable 'omit' from source: magic vars 12033 1726867210.09374: variable 'omit' from source: magic vars 12033 1726867210.09397: variable 'network_service_name' from source: role '' defaults 12033 1726867210.09444: variable 'network_service_name' from source: role '' defaults 12033 1726867210.09521: variable '__network_provider_setup' 
from source: role '' defaults 12033 1726867210.09525: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867210.09569: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867210.09578: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867210.09627: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867210.09768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867210.11204: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867210.11249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867210.11279: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867210.11315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867210.11334: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867210.11392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.11416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.11433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.11462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.11473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.11509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.11526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.11542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.11571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.11584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.11724: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867210.11796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.11816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12033 1726867210.11832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.11856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.11867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.11933: variable 'ansible_python' from source: facts 12033 1726867210.11946: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867210.12001: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867210.12056: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867210.12140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.12158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.12174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.12200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 
1726867210.12216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.12248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.12268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.12286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.12315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.12328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.12414: variable 'network_connections' from source: task vars 12033 1726867210.12420: variable 'controller_profile' from source: play vars 12033 1726867210.12476: variable 'controller_profile' from source: play vars 12033 1726867210.12487: variable 'controller_device' from source: play vars 12033 1726867210.12543: variable 'controller_device' from source: play vars 12033 1726867210.12583: variable 'dhcp_interface1' from source: play vars 12033 1726867210.12599: variable 'dhcp_interface1' from source: play vars 12033 1726867210.12613: variable 'port1_profile' from source: play vars 12033 1726867210.12663: variable 'port1_profile' 
from source: play vars 12033 1726867210.12672: variable 'dhcp_interface1' from source: play vars 12033 1726867210.12725: variable 'dhcp_interface1' from source: play vars 12033 1726867210.12733: variable 'controller_profile' from source: play vars 12033 1726867210.12786: variable 'controller_profile' from source: play vars 12033 1726867210.12796: variable 'port2_profile' from source: play vars 12033 1726867210.12847: variable 'port2_profile' from source: play vars 12033 1726867210.12856: variable 'dhcp_interface2' from source: play vars 12033 1726867210.12912: variable 'dhcp_interface2' from source: play vars 12033 1726867210.12921: variable 'controller_profile' from source: play vars 12033 1726867210.12969: variable 'controller_profile' from source: play vars 12033 1726867210.13042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867210.13158: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867210.13197: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867210.13238: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867210.13268: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867210.13315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867210.13336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867210.13358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.13381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867210.13426: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867210.13598: variable 'network_connections' from source: task vars 12033 1726867210.13605: variable 'controller_profile' from source: play vars 12033 1726867210.13660: variable 'controller_profile' from source: play vars 12033 1726867210.13669: variable 'controller_device' from source: play vars 12033 1726867210.13722: variable 'controller_device' from source: play vars 12033 1726867210.13738: variable 'dhcp_interface1' from source: play vars 12033 1726867210.13782: variable 'dhcp_interface1' from source: play vars 12033 1726867210.13793: variable 'port1_profile' from source: play vars 12033 1726867210.13846: variable 'port1_profile' from source: play vars 12033 1726867210.13857: variable 'dhcp_interface1' from source: play vars 12033 1726867210.13906: variable 'dhcp_interface1' from source: play vars 12033 1726867210.13915: variable 'controller_profile' from source: play vars 12033 1726867210.13968: variable 'controller_profile' from source: play vars 12033 1726867210.13979: variable 'port2_profile' from source: play vars 12033 1726867210.14029: variable 'port2_profile' from source: play vars 12033 1726867210.14038: variable 'dhcp_interface2' from source: play vars 12033 1726867210.14092: variable 'dhcp_interface2' from source: play vars 12033 1726867210.14101: variable 'controller_profile' from source: play vars 12033 1726867210.14150: variable 'controller_profile' from source: play vars 12033 1726867210.14187: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867210.14240: variable 
'__network_wireless_connections_defined' from source: role '' defaults 12033 1726867210.14423: variable 'network_connections' from source: task vars 12033 1726867210.14426: variable 'controller_profile' from source: play vars 12033 1726867210.14474: variable 'controller_profile' from source: play vars 12033 1726867210.14482: variable 'controller_device' from source: play vars 12033 1726867210.14536: variable 'controller_device' from source: play vars 12033 1726867210.14542: variable 'dhcp_interface1' from source: play vars 12033 1726867210.14601: variable 'dhcp_interface1' from source: play vars 12033 1726867210.14610: variable 'port1_profile' from source: play vars 12033 1726867210.14657: variable 'port1_profile' from source: play vars 12033 1726867210.14662: variable 'dhcp_interface1' from source: play vars 12033 1726867210.14715: variable 'dhcp_interface1' from source: play vars 12033 1726867210.14718: variable 'controller_profile' from source: play vars 12033 1726867210.14768: variable 'controller_profile' from source: play vars 12033 1726867210.14774: variable 'port2_profile' from source: play vars 12033 1726867210.14824: variable 'port2_profile' from source: play vars 12033 1726867210.14837: variable 'dhcp_interface2' from source: play vars 12033 1726867210.14882: variable 'dhcp_interface2' from source: play vars 12033 1726867210.14888: variable 'controller_profile' from source: play vars 12033 1726867210.14940: variable 'controller_profile' from source: play vars 12033 1726867210.14959: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867210.15015: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867210.15198: variable 'network_connections' from source: task vars 12033 1726867210.15201: variable 'controller_profile' from source: play vars 12033 1726867210.15253: variable 'controller_profile' from source: play vars 12033 1726867210.15258: variable 'controller_device' from source: play vars 
12033 1726867210.15312: variable 'controller_device' from source: play vars 12033 1726867210.15319: variable 'dhcp_interface1' from source: play vars 12033 1726867210.15373: variable 'dhcp_interface1' from source: play vars 12033 1726867210.15382: variable 'port1_profile' from source: play vars 12033 1726867210.15433: variable 'port1_profile' from source: play vars 12033 1726867210.15439: variable 'dhcp_interface1' from source: play vars 12033 1726867210.15517: variable 'dhcp_interface1' from source: play vars 12033 1726867210.15520: variable 'controller_profile' from source: play vars 12033 1726867210.15564: variable 'controller_profile' from source: play vars 12033 1726867210.15571: variable 'port2_profile' from source: play vars 12033 1726867210.15626: variable 'port2_profile' from source: play vars 12033 1726867210.15629: variable 'dhcp_interface2' from source: play vars 12033 1726867210.15692: variable 'dhcp_interface2' from source: play vars 12033 1726867210.15696: variable 'controller_profile' from source: play vars 12033 1726867210.15737: variable 'controller_profile' from source: play vars 12033 1726867210.15781: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867210.15823: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867210.15830: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867210.15872: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867210.16006: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867210.16306: variable 'network_connections' from source: task vars 12033 1726867210.16310: variable 'controller_profile' from source: play vars 12033 1726867210.16351: variable 'controller_profile' from source: play vars 12033 1726867210.16356: variable 'controller_device' from source: play vars 12033 1726867210.16403: variable 
'controller_device' from source: play vars 12033 1726867210.16407: variable 'dhcp_interface1' from source: play vars 12033 1726867210.16450: variable 'dhcp_interface1' from source: play vars 12033 1726867210.16457: variable 'port1_profile' from source: play vars 12033 1726867210.16504: variable 'port1_profile' from source: play vars 12033 1726867210.16507: variable 'dhcp_interface1' from source: play vars 12033 1726867210.16548: variable 'dhcp_interface1' from source: play vars 12033 1726867210.16553: variable 'controller_profile' from source: play vars 12033 1726867210.16598: variable 'controller_profile' from source: play vars 12033 1726867210.16604: variable 'port2_profile' from source: play vars 12033 1726867210.16645: variable 'port2_profile' from source: play vars 12033 1726867210.16651: variable 'dhcp_interface2' from source: play vars 12033 1726867210.16694: variable 'dhcp_interface2' from source: play vars 12033 1726867210.16702: variable 'controller_profile' from source: play vars 12033 1726867210.16744: variable 'controller_profile' from source: play vars 12033 1726867210.16750: variable 'ansible_distribution' from source: facts 12033 1726867210.16753: variable '__network_rh_distros' from source: role '' defaults 12033 1726867210.16759: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.16776: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867210.16905: variable 'ansible_distribution' from source: facts 12033 1726867210.16909: variable '__network_rh_distros' from source: role '' defaults 12033 1726867210.16911: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.16923: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867210.17034: variable 'ansible_distribution' from source: facts 12033 1726867210.17037: variable '__network_rh_distros' from source: role '' defaults 12033 
1726867210.17042: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.17070: variable 'network_provider' from source: set_fact 12033 1726867210.17088: variable 'omit' from source: magic vars 12033 1726867210.17109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867210.17130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867210.17144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867210.17157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867210.17182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867210.17285: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867210.17289: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.17291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.17335: Set connection var ansible_pipelining to False 12033 1726867210.17349: Set connection var ansible_shell_executable to /bin/sh 12033 1726867210.17361: Set connection var ansible_timeout to 10 12033 1726867210.17370: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867210.17379: Set connection var ansible_connection to ssh 12033 1726867210.17389: Set connection var ansible_shell_type to sh 12033 1726867210.17416: variable 'ansible_shell_executable' from source: unknown 12033 1726867210.17424: variable 'ansible_connection' from source: unknown 12033 1726867210.17432: variable 'ansible_module_compression' from source: unknown 12033 1726867210.17439: variable 'ansible_shell_type' from source: unknown 12033 1726867210.17446: variable 
'ansible_shell_executable' from source: unknown 12033 1726867210.17682: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.17685: variable 'ansible_pipelining' from source: unknown 12033 1726867210.17688: variable 'ansible_timeout' from source: unknown 12033 1726867210.17690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.17693: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867210.17695: variable 'omit' from source: magic vars 12033 1726867210.17697: starting attempt loop 12033 1726867210.17699: running the handler 12033 1726867210.17703: variable 'ansible_facts' from source: unknown 12033 1726867210.18484: _low_level_execute_command(): starting 12033 1726867210.18491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867210.18950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867210.18954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.18957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867210.18959: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.19023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867210.19026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.19027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.19072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.20815: stdout chunk (state=3): >>>/root <<< 12033 1726867210.20883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867210.20907: stderr chunk (state=3): >>><<< 12033 1726867210.20916: stdout chunk (state=3): >>><<< 12033 1726867210.20938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867210.21012: _low_level_execute_command(): starting 12033 1726867210.21017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392 `" && echo ansible-tmp-1726867210.2094402-14435-204932478101392="` echo /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392 `" ) && sleep 0' 12033 1726867210.21536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867210.21557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867210.21573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867210.21597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867210.21613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867210.21676: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.21722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867210.21737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.21756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.21839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.23720: stdout chunk (state=3): >>>ansible-tmp-1726867210.2094402-14435-204932478101392=/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392 <<< 12033 1726867210.23884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867210.23887: stdout chunk (state=3): >>><<< 12033 1726867210.23889: stderr chunk (state=3): >>><<< 12033 1726867210.23908: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867210.2094402-14435-204932478101392=/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867210.24038: variable 'ansible_module_compression' from source: unknown 12033 1726867210.24041: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12033 1726867210.24076: variable 'ansible_facts' from source: unknown 12033 1726867210.24315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py 12033 1726867210.24575: Sending initial data 12033 1726867210.24581: Sent initial data (156 bytes) 12033 1726867210.25185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.25217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.25293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.26815: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867210.26874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867210.26944: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgbstko11 /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py <<< 12033 1726867210.26967: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py" <<< 12033 1726867210.27003: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgbstko11" to remote "/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py" <<< 12033 1726867210.28604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867210.28608: stdout chunk (state=3): >>><<< 12033 1726867210.28610: stderr chunk (state=3): >>><<< 12033 1726867210.28612: done transferring module to remote 12033 1726867210.28614: 
_low_level_execute_command(): starting 12033 1726867210.28617: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/ /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py && sleep 0' 12033 1726867210.29199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867210.29268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.29327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.29348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.29426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.31203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867210.31218: stdout chunk (state=3): >>><<< 12033 1726867210.31229: stderr chunk (state=3): >>><<< 12033 1726867210.31247: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867210.31255: _low_level_execute_command(): starting 12033 1726867210.31263: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/AnsiballZ_systemd.py && sleep 0' 12033 1726867210.31869: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867210.31887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867210.31900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867210.31918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867210.31993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.32039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867210.32060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.32085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.32164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.61291: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10375168", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321229312", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "857038000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", 
"IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 12033 1726867210.61322: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", 
"ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": 
"369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12033 1726867210.63160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867210.63174: stderr chunk (state=3): >>><<< 12033 1726867210.63383: stdout chunk (state=3): >>><<< 12033 1726867210.63389: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10375168", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321229312", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "857038000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867210.63417: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867210.63442: _low_level_execute_command(): starting 12033 1726867210.63452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867210.2094402-14435-204932478101392/ > /dev/null 2>&1 && sleep 0' 12033 1726867210.64063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867210.64075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867210.64095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867210.64117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867210.64134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867210.64145: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867210.64195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867210.64255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867210.64273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867210.64297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867210.64369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867210.66207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867210.66248: stderr chunk (state=3): >>><<< 12033 1726867210.66258: stdout chunk (state=3): >>><<< 12033 1726867210.66276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867210.66293: handler run complete 12033 1726867210.66384: attempt loop complete, returning result 12033 1726867210.66400: _execute() done 12033 1726867210.66407: dumping result to json 12033 1726867210.66430: done dumping result, returning 12033 1726867210.66444: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-74bb-502b-000000000a3a] 12033 1726867210.66483: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3a ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867210.66893: no more pending results, returning what we have 12033 1726867210.66898: results queue empty 12033 1726867210.66899: checking for any_errors_fatal 12033 1726867210.66908: done checking for any_errors_fatal 12033 1726867210.66908: checking for max_fail_percentage 12033 1726867210.66913: done checking for max_fail_percentage 12033 1726867210.66914: checking to see if all hosts have failed and the running result is not ok 12033 1726867210.66915: done checking to see if all hosts have failed 12033 1726867210.66916: getting the remaining hosts for this loop 12033 1726867210.66918: done getting the remaining hosts for this loop 12033 1726867210.66921: getting the next task for host managed_node3 12033 1726867210.66929: done getting next task for host managed_node3 12033 1726867210.66933: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867210.66940: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867210.66952: getting variables 12033 1726867210.66954: in VariableManager get_vars() 12033 1726867210.67031: Calling all_inventory to load vars for managed_node3 12033 1726867210.67034: Calling groups_inventory to load vars for managed_node3 12033 1726867210.67037: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867210.67048: Calling all_plugins_play to load vars for managed_node3 12033 1726867210.67051: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867210.67055: Calling groups_plugins_play to load vars for managed_node3 12033 1726867210.67591: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3a 12033 1726867210.67594: WORKER PROCESS EXITING 12033 1726867210.68849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.70637: done with get_vars() 12033 1726867210.70660: done getting variables 12033 1726867210.70718: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:10 -0400 (0:00:00.624) 0:00:49.823 ****** 12033 1726867210.70758: entering _queue_task() for managed_node3/service 12033 1726867210.71197: worker is 1 (out of 1 available) 12033 1726867210.71210: exiting _queue_task() for managed_node3/service 12033 1726867210.71221: done queuing things up, now waiting for results queue to drain 12033 1726867210.71223: waiting for pending results... 
12033 1726867210.71506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867210.71784: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3b 12033 1726867210.71789: variable 'ansible_search_path' from source: unknown 12033 1726867210.71791: variable 'ansible_search_path' from source: unknown 12033 1726867210.71794: calling self._execute() 12033 1726867210.71883: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.71898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.71918: variable 'omit' from source: magic vars 12033 1726867210.72337: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.72363: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.72495: variable 'network_provider' from source: set_fact 12033 1726867210.72507: Evaluated conditional (network_provider == "nm"): True 12033 1726867210.72604: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867210.72700: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867210.72875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867210.75181: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867210.75185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867210.75187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867210.75227: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867210.75260: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867210.75375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.75428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.75464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.75525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.75557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.75620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.75661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.75697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.75749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.75769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.75827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.75850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.75936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.75939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.75942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.76087: variable 'network_connections' from source: task vars 12033 1726867210.76111: variable 'controller_profile' from source: play vars 12033 1726867210.76199: variable 'controller_profile' from source: play vars 12033 1726867210.76224: variable 'controller_device' from source: play vars 12033 1726867210.76292: variable 'controller_device' from source: play vars 12033 1726867210.76306: variable 'dhcp_interface1' from source: play vars 12033 1726867210.76369: variable 'dhcp_interface1' from source: play vars 
12033 1726867210.76388: variable 'port1_profile' from source: play vars 12033 1726867210.76617: variable 'port1_profile' from source: play vars 12033 1726867210.76620: variable 'dhcp_interface1' from source: play vars 12033 1726867210.76622: variable 'dhcp_interface1' from source: play vars 12033 1726867210.76624: variable 'controller_profile' from source: play vars 12033 1726867210.76626: variable 'controller_profile' from source: play vars 12033 1726867210.76628: variable 'port2_profile' from source: play vars 12033 1726867210.76797: variable 'port2_profile' from source: play vars 12033 1726867210.76810: variable 'dhcp_interface2' from source: play vars 12033 1726867210.76873: variable 'dhcp_interface2' from source: play vars 12033 1726867210.76888: variable 'controller_profile' from source: play vars 12033 1726867210.76955: variable 'controller_profile' from source: play vars 12033 1726867210.77028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867210.77182: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867210.77227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867210.77261: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867210.77294: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867210.77342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867210.77365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867210.77403: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.77432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867210.77494: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867210.77751: variable 'network_connections' from source: task vars 12033 1726867210.77765: variable 'controller_profile' from source: play vars 12033 1726867210.77826: variable 'controller_profile' from source: play vars 12033 1726867210.77838: variable 'controller_device' from source: play vars 12033 1726867210.77904: variable 'controller_device' from source: play vars 12033 1726867210.77983: variable 'dhcp_interface1' from source: play vars 12033 1726867210.77986: variable 'dhcp_interface1' from source: play vars 12033 1726867210.77989: variable 'port1_profile' from source: play vars 12033 1726867210.78048: variable 'port1_profile' from source: play vars 12033 1726867210.78061: variable 'dhcp_interface1' from source: play vars 12033 1726867210.78127: variable 'dhcp_interface1' from source: play vars 12033 1726867210.78138: variable 'controller_profile' from source: play vars 12033 1726867210.78203: variable 'controller_profile' from source: play vars 12033 1726867210.78216: variable 'port2_profile' from source: play vars 12033 1726867210.78275: variable 'port2_profile' from source: play vars 12033 1726867210.78290: variable 'dhcp_interface2' from source: play vars 12033 1726867210.78353: variable 'dhcp_interface2' from source: play vars 12033 1726867210.78366: variable 'controller_profile' from source: play vars 12033 1726867210.78432: variable 'controller_profile' from source: play vars 12033 1726867210.78526: Evaluated conditional 
(__network_wpa_supplicant_required): False 12033 1726867210.78529: when evaluation is False, skipping this task 12033 1726867210.78531: _execute() done 12033 1726867210.78534: dumping result to json 12033 1726867210.78536: done dumping result, returning 12033 1726867210.78538: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-74bb-502b-000000000a3b] 12033 1726867210.78540: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3b skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12033 1726867210.78679: no more pending results, returning what we have 12033 1726867210.78683: results queue empty 12033 1726867210.78684: checking for any_errors_fatal 12033 1726867210.78701: done checking for any_errors_fatal 12033 1726867210.78702: checking for max_fail_percentage 12033 1726867210.78704: done checking for max_fail_percentage 12033 1726867210.78705: checking to see if all hosts have failed and the running result is not ok 12033 1726867210.78706: done checking to see if all hosts have failed 12033 1726867210.78707: getting the remaining hosts for this loop 12033 1726867210.78709: done getting the remaining hosts for this loop 12033 1726867210.78713: getting the next task for host managed_node3 12033 1726867210.78722: done getting next task for host managed_node3 12033 1726867210.78725: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867210.78731: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867210.78752: getting variables 12033 1726867210.78754: in VariableManager get_vars() 12033 1726867210.78799: Calling all_inventory to load vars for managed_node3 12033 1726867210.78801: Calling groups_inventory to load vars for managed_node3 12033 1726867210.78804: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867210.78814: Calling all_plugins_play to load vars for managed_node3 12033 1726867210.78818: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867210.78821: Calling groups_plugins_play to load vars for managed_node3 12033 1726867210.79590: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3b 12033 1726867210.79594: WORKER PROCESS EXITING 12033 1726867210.80416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.81938: done with get_vars() 12033 1726867210.81961: done getting variables 12033 1726867210.82022: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:10 -0400 (0:00:00.112) 0:00:49.936 ****** 12033 1726867210.82058: entering _queue_task() for managed_node3/service 12033 1726867210.82608: worker is 1 (out of 1 available) 12033 1726867210.82617: exiting _queue_task() for managed_node3/service 12033 1726867210.82627: done queuing things up, now waiting for results queue to drain 12033 1726867210.82628: waiting for pending results... 12033 1726867210.82695: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867210.82864: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3c 12033 1726867210.82887: variable 'ansible_search_path' from source: unknown 12033 1726867210.82895: variable 'ansible_search_path' from source: unknown 12033 1726867210.82940: calling self._execute() 12033 1726867210.83040: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.83055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.83074: variable 'omit' from source: magic vars 12033 1726867210.83452: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.83472: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.83593: variable 'network_provider' from source: set_fact 12033 1726867210.83607: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867210.83614: when evaluation is False, skipping this task 12033 1726867210.83620: _execute() done 12033 1726867210.83626: dumping result to json 12033 1726867210.83632: done dumping result, 
returning 12033 1726867210.83641: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-74bb-502b-000000000a3c] 12033 1726867210.83650: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3c skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867210.83785: no more pending results, returning what we have 12033 1726867210.83790: results queue empty 12033 1726867210.83791: checking for any_errors_fatal 12033 1726867210.83800: done checking for any_errors_fatal 12033 1726867210.83801: checking for max_fail_percentage 12033 1726867210.83803: done checking for max_fail_percentage 12033 1726867210.83804: checking to see if all hosts have failed and the running result is not ok 12033 1726867210.83805: done checking to see if all hosts have failed 12033 1726867210.83806: getting the remaining hosts for this loop 12033 1726867210.83808: done getting the remaining hosts for this loop 12033 1726867210.83812: getting the next task for host managed_node3 12033 1726867210.83819: done getting next task for host managed_node3 12033 1726867210.83822: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867210.83828: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867210.83850: getting variables 12033 1726867210.83851: in VariableManager get_vars() 12033 1726867210.83894: Calling all_inventory to load vars for managed_node3 12033 1726867210.83897: Calling groups_inventory to load vars for managed_node3 12033 1726867210.83900: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867210.83911: Calling all_plugins_play to load vars for managed_node3 12033 1726867210.83914: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867210.83917: Calling groups_plugins_play to load vars for managed_node3 12033 1726867210.84690: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3c 12033 1726867210.84693: WORKER PROCESS EXITING 12033 1726867210.85590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.87060: done with get_vars() 12033 1726867210.87082: done getting variables 12033 1726867210.87137: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:10 -0400 (0:00:00.051) 0:00:49.987 ****** 12033 1726867210.87173: entering _queue_task() for managed_node3/copy 12033 1726867210.87458: worker is 1 (out of 1 available) 12033 1726867210.87472: exiting _queue_task() for managed_node3/copy 12033 1726867210.87687: done queuing things up, now waiting for results queue to drain 12033 1726867210.87689: waiting for pending results... 12033 1726867210.87770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867210.87918: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3d 12033 1726867210.87936: variable 'ansible_search_path' from source: unknown 12033 1726867210.87943: variable 'ansible_search_path' from source: unknown 12033 1726867210.87982: calling self._execute() 12033 1726867210.88079: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.88091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.88102: variable 'omit' from source: magic vars 12033 1726867210.88470: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.88489: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.88610: variable 'network_provider' from source: set_fact 12033 1726867210.88621: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867210.88628: when evaluation is False, skipping this task 12033 1726867210.88634: _execute() done 12033 1726867210.88641: dumping result to json 12033 1726867210.88647: done dumping result, returning 12033 1726867210.88658: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-74bb-502b-000000000a3d] 12033 1726867210.88673: 
sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3d skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12033 1726867210.88840: no more pending results, returning what we have 12033 1726867210.88846: results queue empty 12033 1726867210.88847: checking for any_errors_fatal 12033 1726867210.88855: done checking for any_errors_fatal 12033 1726867210.88855: checking for max_fail_percentage 12033 1726867210.88858: done checking for max_fail_percentage 12033 1726867210.88859: checking to see if all hosts have failed and the running result is not ok 12033 1726867210.88860: done checking to see if all hosts have failed 12033 1726867210.88860: getting the remaining hosts for this loop 12033 1726867210.88862: done getting the remaining hosts for this loop 12033 1726867210.88866: getting the next task for host managed_node3 12033 1726867210.88875: done getting next task for host managed_node3 12033 1726867210.88881: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867210.88888: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867210.88912: getting variables 12033 1726867210.88914: in VariableManager get_vars() 12033 1726867210.88959: Calling all_inventory to load vars for managed_node3 12033 1726867210.88962: Calling groups_inventory to load vars for managed_node3 12033 1726867210.88964: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867210.89081: Calling all_plugins_play to load vars for managed_node3 12033 1726867210.89087: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867210.89093: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3d 12033 1726867210.89096: WORKER PROCESS EXITING 12033 1726867210.89100: Calling groups_plugins_play to load vars for managed_node3 12033 1726867210.90598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867210.92109: done with get_vars() 12033 1726867210.92138: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:10 -0400 (0:00:00.050) 0:00:50.038 ****** 12033 1726867210.92239: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867210.92610: worker is 1 (out of 1 available) 12033 1726867210.92624: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867210.92638: done queuing things up, now waiting for results queue to drain 12033 1726867210.92640: waiting for pending results... 
12033 1726867210.92946: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867210.93118: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3e 12033 1726867210.93141: variable 'ansible_search_path' from source: unknown 12033 1726867210.93149: variable 'ansible_search_path' from source: unknown 12033 1726867210.93195: calling self._execute() 12033 1726867210.93303: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867210.93321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867210.93335: variable 'omit' from source: magic vars 12033 1726867210.93712: variable 'ansible_distribution_major_version' from source: facts 12033 1726867210.93754: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867210.93758: variable 'omit' from source: magic vars 12033 1726867210.93822: variable 'omit' from source: magic vars 12033 1726867210.93997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867210.96383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867210.96395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867210.96436: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867210.96472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867210.96502: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867210.96579: variable 'network_provider' from source: set_fact 12033 1726867210.96710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867210.96744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867210.96771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867210.96816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867210.96849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867210.96913: variable 'omit' from source: magic vars 12033 1726867210.97182: variable 'omit' from source: magic vars 12033 1726867210.97185: variable 'network_connections' from source: task vars 12033 1726867210.97187: variable 'controller_profile' from source: play vars 12033 1726867210.97192: variable 'controller_profile' from source: play vars 12033 1726867210.97205: variable 'controller_device' from source: play vars 12033 1726867210.97265: variable 'controller_device' from source: play vars 12033 1726867210.97281: variable 'dhcp_interface1' from source: play vars 12033 1726867210.97345: variable 'dhcp_interface1' from source: play vars 12033 1726867210.97358: variable 'port1_profile' from source: play vars 12033 1726867210.97425: variable 'port1_profile' from source: play vars 12033 1726867210.97437: variable 'dhcp_interface1' from source: play vars 12033 1726867210.97502: variable 'dhcp_interface1' from source: play vars 12033 1726867210.97516: 
variable 'controller_profile' from source: play vars 12033 1726867210.97581: variable 'controller_profile' from source: play vars 12033 1726867210.97593: variable 'port2_profile' from source: play vars 12033 1726867210.97661: variable 'port2_profile' from source: play vars 12033 1726867210.97672: variable 'dhcp_interface2' from source: play vars 12033 1726867210.97733: variable 'dhcp_interface2' from source: play vars 12033 1726867210.97750: variable 'controller_profile' from source: play vars 12033 1726867210.97813: variable 'controller_profile' from source: play vars 12033 1726867210.98014: variable 'omit' from source: magic vars 12033 1726867210.98027: variable '__lsr_ansible_managed' from source: task vars 12033 1726867210.98093: variable '__lsr_ansible_managed' from source: task vars 12033 1726867210.98280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12033 1726867210.98484: Loaded config def from plugin (lookup/template) 12033 1726867210.98496: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12033 1726867210.98526: File lookup term: get_ansible_managed.j2 12033 1726867210.98533: variable 'ansible_search_path' from source: unknown 12033 1726867210.98542: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12033 1726867210.98558: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12033 1726867210.98581: variable 'ansible_search_path' from source: unknown 12033 1726867211.10458: variable 'ansible_managed' from source: unknown 12033 1726867211.10671: variable 'omit' from source: magic vars 12033 1726867211.10674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867211.10679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867211.10681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867211.10683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.10692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.10716: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867211.10725: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.10734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.10829: Set connection var ansible_pipelining to False 12033 1726867211.10843: Set connection var ansible_shell_executable to /bin/sh 12033 1726867211.10865: Set connection var ansible_timeout to 10 12033 
1726867211.10875: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867211.10887: Set connection var ansible_connection to ssh 12033 1726867211.10897: Set connection var ansible_shell_type to sh 12033 1726867211.10922: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.10991: variable 'ansible_connection' from source: unknown 12033 1726867211.10994: variable 'ansible_module_compression' from source: unknown 12033 1726867211.10996: variable 'ansible_shell_type' from source: unknown 12033 1726867211.10997: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.10999: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.11001: variable 'ansible_pipelining' from source: unknown 12033 1726867211.11002: variable 'ansible_timeout' from source: unknown 12033 1726867211.11004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.11073: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867211.11088: variable 'omit' from source: magic vars 12033 1726867211.11103: starting attempt loop 12033 1726867211.11110: running the handler 12033 1726867211.11124: _low_level_execute_command(): starting 12033 1726867211.11133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867211.11759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867211.11793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 
1726867211.11883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.11903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.11919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.11947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.12031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.13705: stdout chunk (state=3): >>>/root <<< 12033 1726867211.13828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.13867: stderr chunk (state=3): >>><<< 12033 1726867211.13871: stdout chunk (state=3): >>><<< 12033 1726867211.13894: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.13909: _low_level_execute_command(): starting 12033 1726867211.13914: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223 `" && echo ansible-tmp-1726867211.1389208-14463-159557752447223="` echo /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223 `" ) && sleep 0' 12033 1726867211.14532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867211.14583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.14586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.14589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.14592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867211.14594: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867211.14596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.14607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867211.14615: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867211.14624: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867211.14702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.14749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.14796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.16657: stdout chunk (state=3): >>>ansible-tmp-1726867211.1389208-14463-159557752447223=/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223 <<< 12033 1726867211.16814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.16817: stdout chunk (state=3): >>><<< 12033 1726867211.16819: stderr chunk (state=3): >>><<< 12033 1726867211.16983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867211.1389208-14463-159557752447223=/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.16991: variable 'ansible_module_compression' from source: unknown 12033 1726867211.16994: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12033 1726867211.16996: variable 'ansible_facts' from source: unknown 12033 1726867211.17137: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py 12033 1726867211.17352: Sending initial data 12033 1726867211.17355: Sent initial data (168 bytes) 12033 1726867211.17970: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.18041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.18053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.18062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.18152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.19676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867211.19760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867211.19818: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6zt2rtnb /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py <<< 12033 1726867211.19822: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py" <<< 12033 1726867211.19880: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6zt2rtnb" to remote "/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py" <<< 12033 1726867211.21060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.21184: stdout chunk (state=3): >>><<< 12033 1726867211.21188: stderr chunk (state=3): >>><<< 12033 1726867211.21190: done transferring module to remote 12033 1726867211.21192: _low_level_execute_command(): starting 12033 1726867211.21195: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/ /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py && sleep 0' 12033 1726867211.21957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.22026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.22029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.22051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.22134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.23987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.23990: stdout chunk (state=3): >>><<< 12033 1726867211.23992: stderr chunk (state=3): >>><<< 12033 1726867211.23998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.24001: _low_level_execute_command(): starting 12033 1726867211.24004: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/AnsiballZ_network_connections.py && sleep 0' 12033 1726867211.24567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867211.24575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.24587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.24640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.24643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867211.24651: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867211.24653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.24656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867211.24658: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867211.24661: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867211.24663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.24665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.24679: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.24749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867211.24753: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867211.24755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.24764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.24775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.24792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.24863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.65953: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, 
{"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12033 1726867211.68084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867211.68089: stdout chunk (state=3): >>><<< 12033 1726867211.68091: stderr chunk (state=3): >>><<< 12033 1726867211.68390: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", 
"interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
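For reference, the module_args dumped above correspond to a `fedora.linux_system_roles.network` role invocation along the following lines. This is a hedged reconstruction from the log, not the actual playbook that produced it; the play-level keys (`hosts`, `roles`) are assumptions, while the `network_connections` values are copied from the result:

```yaml
# Sketch of role variables matching the module_args in the log above
# (play wiring is assumed; connection values are taken from the result dump).
- hosts: managed_node3
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: nm-bond
        bond:
          mode: active-backup
          arp_interval: 60
          arp_ip_target: 192.0.2.128
          arp_validate: none
          primary: test1
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        controller: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        controller: bond0
```

The `(is-modified)` and `(not-active)` markers in the module stderr indicate why each connection was brought up: the bond profile differed from the existing one, and the two port profiles were not yet active.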
12033 1726867211.68394: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867211.68403: _low_level_execute_command(): starting 12033 1726867211.68406: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867211.1389208-14463-159557752447223/ > /dev/null 2>&1 && sleep 0' 12033 1726867211.69118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867211.69128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.69139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.69160: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.69172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.69185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.69228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.69240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.69289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.71341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.71345: stdout chunk (state=3): >>><<< 12033 1726867211.71347: stderr chunk (state=3): >>><<< 12033 1726867211.71349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.71352: handler run complete 12033 1726867211.71485: attempt loop complete, returning result 12033 1726867211.71490: _execute() done 12033 1726867211.71492: dumping result to json 12033 1726867211.71494: done dumping result, returning 12033 1726867211.71496: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-74bb-502b-000000000a3e] 12033 1726867211.71498: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3e changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up 
persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active) 12033 1726867211.71761: no more pending results, returning what we have 12033 1726867211.71765: results queue empty 12033 1726867211.71766: checking for any_errors_fatal 12033 1726867211.71771: done checking for any_errors_fatal 12033 1726867211.71772: checking for max_fail_percentage 12033 1726867211.71774: done checking for max_fail_percentage 12033 1726867211.71775: checking to see if all hosts have failed and the running result is not ok 12033 1726867211.71776: done checking to see if all hosts have failed 12033 1726867211.71981: getting the remaining hosts for this loop 12033 1726867211.71984: done getting the remaining hosts for this loop 12033 1726867211.71988: getting the next task for host managed_node3 12033 1726867211.71996: done getting next task for host managed_node3 12033 1726867211.72002: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867211.72007: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867211.72013: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3e 12033 1726867211.72024: WORKER PROCESS EXITING 12033 1726867211.72033: getting variables 12033 1726867211.72034: in VariableManager get_vars() 12033 1726867211.72074: Calling all_inventory to load vars for managed_node3 12033 1726867211.72079: Calling groups_inventory to load vars for managed_node3 12033 1726867211.72081: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867211.72090: Calling all_plugins_play to load vars for managed_node3 12033 1726867211.72093: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867211.72095: Calling groups_plugins_play to load vars for managed_node3 12033 1726867211.73314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867211.74309: done with get_vars() 12033 1726867211.74330: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:11 -0400 (0:00:00.821) 0:00:50.859 ****** 12033 
1726867211.74423: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867211.74759: worker is 1 (out of 1 available) 12033 1726867211.74774: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867211.74788: done queuing things up, now waiting for results queue to drain 12033 1726867211.74790: waiting for pending results... 12033 1726867211.75031: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867211.75214: in run() - task 0affcac9-a3a5-74bb-502b-000000000a3f 12033 1726867211.75244: variable 'ansible_search_path' from source: unknown 12033 1726867211.75255: variable 'ansible_search_path' from source: unknown 12033 1726867211.75299: calling self._execute() 12033 1726867211.75457: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.75461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.75464: variable 'omit' from source: magic vars 12033 1726867211.75819: variable 'ansible_distribution_major_version' from source: facts 12033 1726867211.75837: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867211.75974: variable 'network_state' from source: role '' defaults 12033 1726867211.75990: Evaluated conditional (network_state != {}): False 12033 1726867211.75993: when evaluation is False, skipping this task 12033 1726867211.76002: _execute() done 12033 1726867211.76006: dumping result to json 12033 1726867211.76008: done dumping result, returning 12033 1726867211.76013: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-74bb-502b-000000000a3f] 12033 1726867211.76018: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3f 12033 1726867211.76117: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a3f 12033 
1726867211.76120: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867211.76173: no more pending results, returning what we have 12033 1726867211.76179: results queue empty 12033 1726867211.76180: checking for any_errors_fatal 12033 1726867211.76194: done checking for any_errors_fatal 12033 1726867211.76194: checking for max_fail_percentage 12033 1726867211.76196: done checking for max_fail_percentage 12033 1726867211.76197: checking to see if all hosts have failed and the running result is not ok 12033 1726867211.76198: done checking to see if all hosts have failed 12033 1726867211.76198: getting the remaining hosts for this loop 12033 1726867211.76203: done getting the remaining hosts for this loop 12033 1726867211.76206: getting the next task for host managed_node3 12033 1726867211.76212: done getting next task for host managed_node3 12033 1726867211.76215: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867211.76219: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867211.76239: getting variables 12033 1726867211.76240: in VariableManager get_vars() 12033 1726867211.76272: Calling all_inventory to load vars for managed_node3 12033 1726867211.76275: Calling groups_inventory to load vars for managed_node3 12033 1726867211.76278: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867211.76286: Calling all_plugins_play to load vars for managed_node3 12033 1726867211.76288: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867211.76291: Calling groups_plugins_play to load vars for managed_node3 12033 1726867211.77050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867211.78245: done with get_vars() 12033 1726867211.78266: done getting variables 12033 1726867211.78327: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:11 -0400 (0:00:00.039) 0:00:50.899 ****** 12033 1726867211.78361: entering _queue_task() for managed_node3/debug 12033 1726867211.78683: worker is 1 (out of 1 available) 12033 1726867211.78698: exiting _queue_task() for managed_node3/debug 12033 1726867211.78712: done queuing things up, now waiting for results queue to drain 12033 
1726867211.78715: waiting for pending results... 12033 1726867211.78916: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867211.79030: in run() - task 0affcac9-a3a5-74bb-502b-000000000a40 12033 1726867211.79040: variable 'ansible_search_path' from source: unknown 12033 1726867211.79045: variable 'ansible_search_path' from source: unknown 12033 1726867211.79081: calling self._execute() 12033 1726867211.79145: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.79149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.79159: variable 'omit' from source: magic vars 12033 1726867211.79427: variable 'ansible_distribution_major_version' from source: facts 12033 1726867211.79437: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867211.79444: variable 'omit' from source: magic vars 12033 1726867211.79492: variable 'omit' from source: magic vars 12033 1726867211.79520: variable 'omit' from source: magic vars 12033 1726867211.79551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867211.79579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867211.79595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867211.79613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.79622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.79647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867211.79650: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867211.79653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.79726: Set connection var ansible_pipelining to False 12033 1726867211.79736: Set connection var ansible_shell_executable to /bin/sh 12033 1726867211.79740: Set connection var ansible_timeout to 10 12033 1726867211.79746: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867211.79748: Set connection var ansible_connection to ssh 12033 1726867211.79753: Set connection var ansible_shell_type to sh 12033 1726867211.79769: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.79771: variable 'ansible_connection' from source: unknown 12033 1726867211.79774: variable 'ansible_module_compression' from source: unknown 12033 1726867211.79778: variable 'ansible_shell_type' from source: unknown 12033 1726867211.79781: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.79785: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.79788: variable 'ansible_pipelining' from source: unknown 12033 1726867211.79791: variable 'ansible_timeout' from source: unknown 12033 1726867211.79795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.79897: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867211.79910: variable 'omit' from source: magic vars 12033 1726867211.79914: starting attempt loop 12033 1726867211.79916: running the handler 12033 1726867211.80010: variable '__network_connections_result' from source: set_fact 12033 1726867211.80063: handler run complete 12033 1726867211.80073: attempt loop complete, returning result 12033 1726867211.80076: _execute() done 12033 
1726867211.80082: dumping result to json 12033 1726867211.80084: done dumping result, returning 12033 1726867211.80091: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-74bb-502b-000000000a40] 12033 1726867211.80096: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a40 12033 1726867211.80181: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a40 12033 1726867211.80184: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)" ] } 12033 1726867211.80248: no more pending results, returning what we have 12033 1726867211.80252: results queue empty 12033 1726867211.80253: checking for any_errors_fatal 12033 1726867211.80260: done checking for any_errors_fatal 12033 1726867211.80261: checking for max_fail_percentage 12033 1726867211.80262: done checking for max_fail_percentage 12033 1726867211.80263: checking to see if all hosts have failed and the running result is not ok 12033 1726867211.80264: done checking to see if all hosts have failed 12033 1726867211.80264: getting the remaining hosts for this loop 12033 1726867211.80267: done getting the remaining hosts for this 
loop 12033 1726867211.80269: getting the next task for host managed_node3 12033 1726867211.80276: done getting next task for host managed_node3 12033 1726867211.80287: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867211.80291: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867211.80302: getting variables 12033 1726867211.80304: in VariableManager get_vars() 12033 1726867211.80336: Calling all_inventory to load vars for managed_node3 12033 1726867211.80338: Calling groups_inventory to load vars for managed_node3 12033 1726867211.80340: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867211.80347: Calling all_plugins_play to load vars for managed_node3 12033 1726867211.80354: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867211.80357: Calling groups_plugins_play to load vars for managed_node3 12033 1726867211.84358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867211.85200: done with get_vars() 12033 1726867211.85216: done getting variables 12033 1726867211.85247: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:11 -0400 (0:00:00.069) 0:00:50.968 ****** 12033 1726867211.85268: entering _queue_task() for managed_node3/debug 12033 1726867211.85517: worker is 1 (out of 1 available) 12033 1726867211.85531: exiting _queue_task() for managed_node3/debug 12033 1726867211.85543: done queuing things up, now waiting for results queue to drain 12033 1726867211.85546: waiting for pending results... 
12033 1726867211.85730: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867211.85842: in run() - task 0affcac9-a3a5-74bb-502b-000000000a41 12033 1726867211.85853: variable 'ansible_search_path' from source: unknown 12033 1726867211.85858: variable 'ansible_search_path' from source: unknown 12033 1726867211.85891: calling self._execute() 12033 1726867211.85960: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.85964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.85973: variable 'omit' from source: magic vars 12033 1726867211.86257: variable 'ansible_distribution_major_version' from source: facts 12033 1726867211.86266: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867211.86273: variable 'omit' from source: magic vars 12033 1726867211.86326: variable 'omit' from source: magic vars 12033 1726867211.86347: variable 'omit' from source: magic vars 12033 1726867211.86380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867211.86409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867211.86427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867211.86442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.86452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.86476: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867211.86481: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.86483: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867211.86552: Set connection var ansible_pipelining to False 12033 1726867211.86559: Set connection var ansible_shell_executable to /bin/sh 12033 1726867211.86566: Set connection var ansible_timeout to 10 12033 1726867211.86571: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867211.86573: Set connection var ansible_connection to ssh 12033 1726867211.86580: Set connection var ansible_shell_type to sh 12033 1726867211.86596: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.86599: variable 'ansible_connection' from source: unknown 12033 1726867211.86602: variable 'ansible_module_compression' from source: unknown 12033 1726867211.86606: variable 'ansible_shell_type' from source: unknown 12033 1726867211.86609: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.86611: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.86615: variable 'ansible_pipelining' from source: unknown 12033 1726867211.86618: variable 'ansible_timeout' from source: unknown 12033 1726867211.86620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.86720: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867211.86729: variable 'omit' from source: magic vars 12033 1726867211.86734: starting attempt loop 12033 1726867211.86737: running the handler 12033 1726867211.86778: variable '__network_connections_result' from source: set_fact 12033 1726867211.86832: variable '__network_connections_result' from source: set_fact 12033 1726867211.86951: handler run complete 12033 1726867211.86971: attempt loop complete, returning result 12033 1726867211.86976: 
_execute() done 12033 1726867211.86980: dumping result to json 12033 1726867211.86982: done dumping result, returning 12033 1726867211.86993: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-74bb-502b-000000000a41] 12033 1726867211.86995: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a41 12033 1726867211.87102: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a41 12033 1726867211.87104: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up 
connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)" ] } } 12033 1726867211.87215: no more pending results, returning what we have 12033 1726867211.87218: results queue empty 12033 1726867211.87219: checking for any_errors_fatal 12033 1726867211.87225: done checking for any_errors_fatal 12033 1726867211.87226: checking for max_fail_percentage 12033 1726867211.87227: done checking for max_fail_percentage 12033 1726867211.87228: checking to see if all hosts have failed and the running result is not ok 12033 1726867211.87229: done checking to see if all hosts have failed 12033 1726867211.87229: getting the remaining hosts for this loop 12033 1726867211.87231: done getting the remaining hosts for this loop 12033 1726867211.87234: getting the next task for host managed_node3 12033 1726867211.87239: done getting next task for host managed_node3 12033 1726867211.87242: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867211.87246: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
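The `module_args` echoed in the debug result above can be read back into a role invocation. The following is a hypothetical reconstruction from the `connections` list in the log, not the literal test playbook; the play structure and variable placement are assumptions:

```yaml
# Hypothetical sketch reconstructed from the module_args above; not the
# actual test playbook. An active-backup bond (nm-bond) with two
# ethernet ports (test1, test2) attached via controller: bond0.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            type: bond
            interface_name: nm-bond
            state: up
            bond:
              mode: active-backup
              arp_interval: 60
              arp_ip_target: 192.0.2.128
              arp_validate: none
              primary: test1
            ip:
              route_metric4: 65535
          - name: bond0.0
            type: ethernet
            interface_name: test1
            controller: bond0
            state: up
          - name: bond0.1
            type: ethernet
            interface_name: test2
            controller: bond0
            state: up
```

The three `add connection` / `up connection` pairs in the stderr above map one-to-one onto these three entries, in list order.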
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867211.87256: getting variables 12033 1726867211.87257: in VariableManager get_vars() 12033 1726867211.87295: Calling all_inventory to load vars for managed_node3 12033 1726867211.87298: Calling groups_inventory to load vars for managed_node3 12033 1726867211.87300: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867211.87307: Calling all_plugins_play to load vars for managed_node3 12033 1726867211.87310: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867211.87312: Calling groups_plugins_play to load vars for managed_node3 12033 1726867211.88036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867211.88989: done with get_vars() 12033 1726867211.89005: done getting variables 12033 1726867211.89043: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:11 -0400 (0:00:00.037) 0:00:51.006 ****** 12033 1726867211.89068: entering _queue_task() for managed_node3/debug 12033 1726867211.89268: worker is 1 (out of 1 available) 12033 1726867211.89283: exiting _queue_task() for managed_node3/debug 12033 1726867211.89294: done queuing things up, now waiting for results queue to drain 12033 1726867211.89296: waiting for pending results... 12033 1726867211.89473: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867211.89580: in run() - task 0affcac9-a3a5-74bb-502b-000000000a42 12033 1726867211.89592: variable 'ansible_search_path' from source: unknown 12033 1726867211.89595: variable 'ansible_search_path' from source: unknown 12033 1726867211.89624: calling self._execute() 12033 1726867211.89696: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.89699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.89708: variable 'omit' from source: magic vars 12033 1726867211.89971: variable 'ansible_distribution_major_version' from source: facts 12033 1726867211.89983: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867211.90065: variable 'network_state' from source: role '' defaults 12033 1726867211.90079: Evaluated conditional (network_state != {}): False 12033 1726867211.90082: when evaluation is False, skipping this task 12033 1726867211.90085: _execute() done 12033 1726867211.90087: dumping result to json 12033 1726867211.90090: done 
dumping result, returning 12033 1726867211.90093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-74bb-502b-000000000a42] 12033 1726867211.90095: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a42 12033 1726867211.90183: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a42 12033 1726867211.90186: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12033 1726867211.90232: no more pending results, returning what we have 12033 1726867211.90235: results queue empty 12033 1726867211.90236: checking for any_errors_fatal 12033 1726867211.90245: done checking for any_errors_fatal 12033 1726867211.90246: checking for max_fail_percentage 12033 1726867211.90248: done checking for max_fail_percentage 12033 1726867211.90248: checking to see if all hosts have failed and the running result is not ok 12033 1726867211.90249: done checking to see if all hosts have failed 12033 1726867211.90250: getting the remaining hosts for this loop 12033 1726867211.90252: done getting the remaining hosts for this loop 12033 1726867211.90254: getting the next task for host managed_node3 12033 1726867211.90261: done getting next task for host managed_node3 12033 1726867211.90264: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867211.90269: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
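The `skipping` result above is driven by the role's guard on `network_state`: the log reports `"false_condition": "network_state != {}"`, and since the role default for `network_state` is an empty dict, the conditional evaluates to False and the task is skipped. A hypothetical sketch of a task guarded this way (the task name, file path, and condition match the log; the debug body is an assumption):

```yaml
# Hypothetical sketch of the guard reported in the log at
# roles/network/tasks/main.yml:186; the debug payload is assumed.
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result
  when: network_state != {}
```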
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867211.90288: getting variables 12033 1726867211.90289: in VariableManager get_vars() 12033 1726867211.90324: Calling all_inventory to load vars for managed_node3 12033 1726867211.90327: Calling groups_inventory to load vars for managed_node3 12033 1726867211.90329: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867211.90337: Calling all_plugins_play to load vars for managed_node3 12033 1726867211.90339: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867211.90342: Calling groups_plugins_play to load vars for managed_node3 12033 1726867211.91056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867211.91915: done with get_vars() 12033 1726867211.91929: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:11 -0400 (0:00:00.029) 0:00:51.035 ****** 12033 1726867211.91997: entering _queue_task() for managed_node3/ping 12033 1726867211.92188: worker is 1 (out of 1 available) 12033 1726867211.92201: exiting _queue_task() for managed_node3/ping 12033 1726867211.92211: done queuing things up, now waiting for results queue to drain 12033 1726867211.92213: waiting for pending 
results... 12033 1726867211.92394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867211.92504: in run() - task 0affcac9-a3a5-74bb-502b-000000000a43 12033 1726867211.92518: variable 'ansible_search_path' from source: unknown 12033 1726867211.92522: variable 'ansible_search_path' from source: unknown 12033 1726867211.92548: calling self._execute() 12033 1726867211.92623: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.92627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.92635: variable 'omit' from source: magic vars 12033 1726867211.92907: variable 'ansible_distribution_major_version' from source: facts 12033 1726867211.92915: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867211.92921: variable 'omit' from source: magic vars 12033 1726867211.92967: variable 'omit' from source: magic vars 12033 1726867211.92995: variable 'omit' from source: magic vars 12033 1726867211.93025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867211.93052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867211.93068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867211.93082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.93096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867211.93120: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867211.93124: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.93126: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867211.93191: Set connection var ansible_pipelining to False 12033 1726867211.93197: Set connection var ansible_shell_executable to /bin/sh 12033 1726867211.93211: Set connection var ansible_timeout to 10 12033 1726867211.93218: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867211.93220: Set connection var ansible_connection to ssh 12033 1726867211.93225: Set connection var ansible_shell_type to sh 12033 1726867211.93242: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.93246: variable 'ansible_connection' from source: unknown 12033 1726867211.93249: variable 'ansible_module_compression' from source: unknown 12033 1726867211.93251: variable 'ansible_shell_type' from source: unknown 12033 1726867211.93254: variable 'ansible_shell_executable' from source: unknown 12033 1726867211.93256: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867211.93258: variable 'ansible_pipelining' from source: unknown 12033 1726867211.93261: variable 'ansible_timeout' from source: unknown 12033 1726867211.93263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867211.93412: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867211.93422: variable 'omit' from source: magic vars 12033 1726867211.93426: starting attempt loop 12033 1726867211.93430: running the handler 12033 1726867211.93441: _low_level_execute_command(): starting 12033 1726867211.93447: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867211.93956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12033 1726867211.93962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.93965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867211.93967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.94021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.94025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867211.94027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.94087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.95816: stdout chunk (state=3): >>>/root <<< 12033 1726867211.95896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.95928: stderr chunk (state=3): >>><<< 12033 1726867211.95931: stdout chunk (state=3): >>><<< 12033 1726867211.95952: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.95962: _low_level_execute_command(): starting 12033 1726867211.95968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621 `" && echo ansible-tmp-1726867211.9595027-14506-229210222582621="` echo /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621 `" ) && sleep 0' 12033 1726867211.96405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.96408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867211.96411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867211.96420: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.96422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.96465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.96470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.96519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867211.98437: stdout chunk (state=3): >>>ansible-tmp-1726867211.9595027-14506-229210222582621=/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621 <<< 12033 1726867211.98547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867211.98570: stderr chunk (state=3): >>><<< 12033 1726867211.98573: stdout chunk (state=3): >>><<< 12033 1726867211.98589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867211.9595027-14506-229210222582621=/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867211.98629: variable 'ansible_module_compression' from source: unknown 12033 1726867211.98661: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12033 1726867211.98692: variable 'ansible_facts' from source: unknown 12033 1726867211.98746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py 12033 1726867211.98841: Sending initial data 12033 1726867211.98845: Sent initial data (153 bytes) 12033 1726867211.99288: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867211.99292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.99294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867211.99296: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867211.99298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867211.99343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867211.99347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867211.99399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.00973: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867212.00980: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867212.01014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867212.01059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpbo85dq3q /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py <<< 12033 1726867212.01064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py" <<< 12033 1726867212.01103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpbo85dq3q" to remote "/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py" <<< 12033 1726867212.01614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.01651: stderr chunk (state=3): >>><<< 12033 1726867212.01654: stdout chunk (state=3): >>><<< 12033 1726867212.01678: done transferring module to remote 12033 1726867212.01686: _low_level_execute_command(): starting 12033 1726867212.01691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/ /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py && sleep 0' 12033 1726867212.02116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.02119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.02122: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867212.02124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867212.02129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.02169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.02172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.02220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.04258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.04280: stderr chunk (state=3): >>><<< 12033 1726867212.04284: stdout chunk (state=3): >>><<< 12033 1726867212.04295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.04302: _low_level_execute_command(): starting 12033 1726867212.04305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/AnsiballZ_ping.py && sleep 0' 12033 1726867212.04709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.04712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.04714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867212.04716: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.04718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.04768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.04775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.04823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.20356: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12033 1726867212.21774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867212.21780: stdout chunk (state=3): >>><<< 12033 1726867212.21783: stderr chunk (state=3): >>><<< 12033 1726867212.21900: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.15.68 closed. 12033 1726867212.21905: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867212.21908: _low_level_execute_command(): starting 12033 1726867212.21910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867211.9595027-14506-229210222582621/ > /dev/null 2>&1 && sleep 0' 12033 1726867212.22437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867212.22453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867212.22468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.22569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.22600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.22670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.24569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.24599: stdout chunk (state=3): >>><<< 12033 1726867212.24611: stderr chunk (state=3): >>><<< 12033 1726867212.24683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.24688: handler run complete 12033 1726867212.24691: attempt loop 
complete, returning result 12033 1726867212.24693: _execute() done 12033 1726867212.24696: dumping result to json 12033 1726867212.24698: done dumping result, returning 12033 1726867212.24703: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-74bb-502b-000000000a43] 12033 1726867212.24706: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a43 ok: [managed_node3] => { "changed": false, "ping": "pong" } 12033 1726867212.24886: no more pending results, returning what we have 12033 1726867212.24890: results queue empty 12033 1726867212.24891: checking for any_errors_fatal 12033 1726867212.24898: done checking for any_errors_fatal 12033 1726867212.24899: checking for max_fail_percentage 12033 1726867212.24903: done checking for max_fail_percentage 12033 1726867212.24904: checking to see if all hosts have failed and the running result is not ok 12033 1726867212.24905: done checking to see if all hosts have failed 12033 1726867212.24905: getting the remaining hosts for this loop 12033 1726867212.24907: done getting the remaining hosts for this loop 12033 1726867212.24910: getting the next task for host managed_node3 12033 1726867212.24921: done getting next task for host managed_node3 12033 1726867212.24923: ^ task is: TASK: meta (role_complete) 12033 1726867212.24928: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867212.24939: getting variables 12033 1726867212.24941: in VariableManager get_vars() 12033 1726867212.25203: Calling all_inventory to load vars for managed_node3 12033 1726867212.25206: Calling groups_inventory to load vars for managed_node3 12033 1726867212.25209: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867212.25220: Calling all_plugins_play to load vars for managed_node3 12033 1726867212.25222: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867212.25225: Calling groups_plugins_play to load vars for managed_node3 12033 1726867212.25816: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a43 12033 1726867212.25819: WORKER PROCESS EXITING 12033 1726867212.27019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867212.28704: done with get_vars() 12033 1726867212.28725: done getting variables 12033 1726867212.28815: done queuing things up, now waiting for results queue to drain 12033 1726867212.28817: results queue empty 12033 1726867212.28818: checking for any_errors_fatal 12033 1726867212.28821: done checking for any_errors_fatal 12033 1726867212.28821: checking for max_fail_percentage 12033 1726867212.28822: done checking for max_fail_percentage 12033 1726867212.28823: checking to see if all hosts have failed and the running result 
is not ok 12033 1726867212.28824: done checking to see if all hosts have failed 12033 1726867212.28825: getting the remaining hosts for this loop 12033 1726867212.28825: done getting the remaining hosts for this loop 12033 1726867212.28828: getting the next task for host managed_node3 12033 1726867212.28832: done getting next task for host managed_node3 12033 1726867212.28834: ^ task is: TASK: Show result 12033 1726867212.28837: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867212.28839: getting variables 12033 1726867212.28840: in VariableManager get_vars() 12033 1726867212.28855: Calling all_inventory to load vars for managed_node3 12033 1726867212.28857: Calling groups_inventory to load vars for managed_node3 12033 1726867212.28859: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867212.28864: Calling all_plugins_play to load vars for managed_node3 12033 1726867212.28866: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867212.28869: Calling groups_plugins_play to load vars for managed_node3 12033 1726867212.30094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867212.31838: done with get_vars() 12033 1726867212.31861: done getting variables 12033 1726867212.31912: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Friday 20 September 2024 17:20:12 -0400 (0:00:00.399) 0:00:51.435 ****** 12033 1726867212.31954: entering _queue_task() for managed_node3/debug 12033 1726867212.32375: worker is 1 (out of 1 available) 12033 1726867212.32390: exiting _queue_task() for managed_node3/debug 12033 1726867212.32407: done queuing things up, now waiting for results queue to drain 12033 1726867212.32408: waiting for pending results... 
12033 1726867212.32708: running TaskExecutor() for managed_node3/TASK: Show result 12033 1726867212.32828: in run() - task 0affcac9-a3a5-74bb-502b-000000000a73 12033 1726867212.32851: variable 'ansible_search_path' from source: unknown 12033 1726867212.32858: variable 'ansible_search_path' from source: unknown 12033 1726867212.32901: calling self._execute() 12033 1726867212.33006: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.33021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.33035: variable 'omit' from source: magic vars 12033 1726867212.33396: variable 'ansible_distribution_major_version' from source: facts 12033 1726867212.33414: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867212.33424: variable 'omit' from source: magic vars 12033 1726867212.33446: variable 'omit' from source: magic vars 12033 1726867212.33485: variable 'omit' from source: magic vars 12033 1726867212.33527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867212.33572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867212.33599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867212.33622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.33783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.33786: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867212.33789: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.33791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.33793: Set 
connection var ansible_pipelining to False 12033 1726867212.33795: Set connection var ansible_shell_executable to /bin/sh 12033 1726867212.33812: Set connection var ansible_timeout to 10 12033 1726867212.33824: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867212.33833: Set connection var ansible_connection to ssh 12033 1726867212.33843: Set connection var ansible_shell_type to sh 12033 1726867212.33870: variable 'ansible_shell_executable' from source: unknown 12033 1726867212.33883: variable 'ansible_connection' from source: unknown 12033 1726867212.33892: variable 'ansible_module_compression' from source: unknown 12033 1726867212.33900: variable 'ansible_shell_type' from source: unknown 12033 1726867212.33913: variable 'ansible_shell_executable' from source: unknown 12033 1726867212.33921: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.33930: variable 'ansible_pipelining' from source: unknown 12033 1726867212.33937: variable 'ansible_timeout' from source: unknown 12033 1726867212.33944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.34086: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867212.34103: variable 'omit' from source: magic vars 12033 1726867212.34114: starting attempt loop 12033 1726867212.34123: running the handler 12033 1726867212.34173: variable '__network_connections_result' from source: set_fact 12033 1726867212.34254: variable '__network_connections_result' from source: set_fact 12033 1726867212.34430: handler run complete 12033 1726867212.34467: attempt loop complete, returning result 12033 1726867212.34474: _execute() done 12033 1726867212.34483: dumping result to json 12033 
1726867212.34493: done dumping result, returning 12033 1726867212.34504: done running TaskExecutor() for managed_node3/TASK: Show result [0affcac9-a3a5-74bb-502b-000000000a73] 12033 1726867212.34562: sending task result for task 0affcac9-a3a5-74bb-502b-000000000a73 12033 1726867212.34640: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000a73 12033 1726867212.34643: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 
'bond0': add connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 560fcc66-c438-49a7-835e-aed531442b3e (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 67803b09-25b3-4aa2-bd80-0afe9d2728fb (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3fbfe47f-1f56-4efb-93c9-a69913e19a91 (not-active)" ] } } 12033 1726867212.34757: no more pending results, returning what we have 12033 1726867212.34760: results queue empty 12033 1726867212.34768: checking for any_errors_fatal 12033 1726867212.34770: done checking for any_errors_fatal 12033 1726867212.34771: checking for max_fail_percentage 12033 1726867212.34773: done checking for max_fail_percentage 12033 1726867212.34774: checking to see if all hosts have failed and the running result is not ok 12033 1726867212.34775: done checking to see if all hosts have failed 12033 1726867212.34776: getting the remaining hosts for this loop 12033 1726867212.34780: done getting the remaining hosts for this loop 12033 1726867212.34783: getting the next task for host managed_node3 12033 1726867212.34792: done getting next task for host managed_node3 12033 1726867212.34795: ^ task is: TASK: Asserts 12033 1726867212.34799: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867212.34805: getting variables 12033 1726867212.34807: in VariableManager get_vars() 12033 1726867212.34850: Calling all_inventory to load vars for managed_node3 12033 1726867212.34853: Calling groups_inventory to load vars for managed_node3 12033 1726867212.34855: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867212.34867: Calling all_plugins_play to load vars for managed_node3 12033 1726867212.34870: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867212.34873: Calling groups_plugins_play to load vars for managed_node3 12033 1726867212.36704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867212.38239: done with get_vars() 12033 1726867212.38263: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 17:20:12 -0400 (0:00:00.063) 0:00:51.499 ****** 12033 1726867212.38356: entering _queue_task() for managed_node3/include_tasks 12033 1726867212.38702: worker is 1 (out of 1 available) 12033 1726867212.38714: exiting _queue_task() for managed_node3/include_tasks 12033 1726867212.38727: done queuing things up, now waiting for results queue to drain 12033 1726867212.38729: waiting for pending results... 
12033 1726867212.39086: running TaskExecutor() for managed_node3/TASK: Asserts 12033 1726867212.39186: in run() - task 0affcac9-a3a5-74bb-502b-0000000008ef 12033 1726867212.39207: variable 'ansible_search_path' from source: unknown 12033 1726867212.39216: variable 'ansible_search_path' from source: unknown 12033 1726867212.39286: variable 'lsr_assert' from source: include params 12033 1726867212.39569: variable 'lsr_assert' from source: include params 12033 1726867212.39651: variable 'omit' from source: magic vars 12033 1726867212.39798: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.39814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.39830: variable 'omit' from source: magic vars 12033 1726867212.40067: variable 'ansible_distribution_major_version' from source: facts 12033 1726867212.40085: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867212.40095: variable 'item' from source: unknown 12033 1726867212.40157: variable 'item' from source: unknown 12033 1726867212.40200: variable 'item' from source: unknown 12033 1726867212.40284: variable 'item' from source: unknown 12033 1726867212.40430: dumping result to json 12033 1726867212.40433: done dumping result, returning 12033 1726867212.40440: done running TaskExecutor() for managed_node3/TASK: Asserts [0affcac9-a3a5-74bb-502b-0000000008ef] 12033 1726867212.40445: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ef 12033 1726867212.40535: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008ef 12033 1726867212.40538: WORKER PROCESS EXITING 12033 1726867212.40562: no more pending results, returning what we have 12033 1726867212.40567: in VariableManager get_vars() 12033 1726867212.40630: Calling all_inventory to load vars for managed_node3 12033 1726867212.40633: Calling groups_inventory to load vars for managed_node3 12033 1726867212.40635: Calling all_plugins_inventory 
to load vars for managed_node3 12033 1726867212.40648: Calling all_plugins_play to load vars for managed_node3 12033 1726867212.40650: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867212.40653: Calling groups_plugins_play to load vars for managed_node3 12033 1726867212.41454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867212.42595: done with get_vars() 12033 1726867212.42620: variable 'ansible_search_path' from source: unknown 12033 1726867212.42621: variable 'ansible_search_path' from source: unknown 12033 1726867212.42657: we have included files to process 12033 1726867212.42658: generating all_blocks data 12033 1726867212.42660: done generating all_blocks data 12033 1726867212.42665: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867212.42666: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867212.42668: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 12033 1726867212.42926: in VariableManager get_vars() 12033 1726867212.42959: done with get_vars() 12033 1726867212.43005: in VariableManager get_vars() 12033 1726867212.43028: done with get_vars() 12033 1726867212.43049: done processing included file 12033 1726867212.43051: iterating over new_blocks loaded from include file 12033 1726867212.43052: in VariableManager get_vars() 12033 1726867212.43072: done with get_vars() 12033 1726867212.43074: filtering new block on tags 12033 1726867212.43120: done filtering new block on tags 12033 1726867212.43123: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 12033 1726867212.43128: extending task lists for all hosts with included blocks 12033 1726867212.45028: done extending task lists 12033 1726867212.45029: done processing included files 12033 1726867212.45029: results queue empty 12033 1726867212.45030: checking for any_errors_fatal 12033 1726867212.45033: done checking for any_errors_fatal 12033 1726867212.45033: checking for max_fail_percentage 12033 1726867212.45034: done checking for max_fail_percentage 12033 1726867212.45035: checking to see if all hosts have failed and the running result is not ok 12033 1726867212.45035: done checking to see if all hosts have failed 12033 1726867212.45035: getting the remaining hosts for this loop 12033 1726867212.45036: done getting the remaining hosts for this loop 12033 1726867212.45038: getting the next task for host managed_node3 12033 1726867212.45041: done getting next task for host managed_node3 12033 1726867212.45042: ^ task is: TASK: ** TEST check bond settings 12033 1726867212.45044: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12033 1726867212.45045: getting variables 12033 1726867212.45046: in VariableManager get_vars() 12033 1726867212.45054: Calling all_inventory to load vars for managed_node3 12033 1726867212.45056: Calling groups_inventory to load vars for managed_node3 12033 1726867212.45057: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867212.45061: Calling all_plugins_play to load vars for managed_node3 12033 1726867212.45062: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867212.45064: Calling groups_plugins_play to load vars for managed_node3 12033 1726867212.45991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867212.47144: done with get_vars() 12033 1726867212.47160: done getting variables 12033 1726867212.47191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 17:20:12 -0400 (0:00:00.088) 0:00:51.587 ****** 12033 1726867212.47212: entering _queue_task() for managed_node3/command 12033 1726867212.47450: worker is 1 (out of 1 available) 12033 1726867212.47463: exiting _queue_task() for managed_node3/command 12033 1726867212.47474: done queuing things up, now waiting for results queue to drain 12033 1726867212.47476: waiting for pending results... 
12033 1726867212.47657: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 12033 1726867212.47726: in run() - task 0affcac9-a3a5-74bb-502b-000000000c2a 12033 1726867212.47739: variable 'ansible_search_path' from source: unknown 12033 1726867212.47742: variable 'ansible_search_path' from source: unknown 12033 1726867212.47778: variable 'bond_options_to_assert' from source: set_fact 12033 1726867212.47946: variable 'bond_options_to_assert' from source: set_fact 12033 1726867212.48031: variable 'omit' from source: magic vars 12033 1726867212.48124: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.48134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.48144: variable 'omit' from source: magic vars 12033 1726867212.48372: variable 'ansible_distribution_major_version' from source: facts 12033 1726867212.48375: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867212.48380: variable 'omit' from source: magic vars 12033 1726867212.48424: variable 'omit' from source: magic vars 12033 1726867212.48688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867212.50743: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867212.50793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867212.50824: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867212.50850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867212.50869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867212.50939: variable 'controller_device' from source: play vars 12033 
1726867212.50942: variable 'bond_opt' from source: unknown 12033 1726867212.50959: variable 'omit' from source: magic vars 12033 1726867212.50981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867212.51008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867212.51021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867212.51039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.51044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.51068: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867212.51071: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.51074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.51139: Set connection var ansible_pipelining to False 12033 1726867212.51146: Set connection var ansible_shell_executable to /bin/sh 12033 1726867212.51155: Set connection var ansible_timeout to 10 12033 1726867212.51161: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867212.51163: Set connection var ansible_connection to ssh 12033 1726867212.51168: Set connection var ansible_shell_type to sh 12033 1726867212.51184: variable 'ansible_shell_executable' from source: unknown 12033 1726867212.51186: variable 'ansible_connection' from source: unknown 12033 1726867212.51189: variable 'ansible_module_compression' from source: unknown 12033 1726867212.51191: variable 'ansible_shell_type' from source: unknown 12033 1726867212.51194: variable 'ansible_shell_executable' from source: unknown 12033 1726867212.51196: variable 'ansible_host' from source: host vars 
for 'managed_node3' 12033 1726867212.51202: variable 'ansible_pipelining' from source: unknown 12033 1726867212.51205: variable 'ansible_timeout' from source: unknown 12033 1726867212.51207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.51278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867212.51286: variable 'omit' from source: magic vars 12033 1726867212.51291: starting attempt loop 12033 1726867212.51294: running the handler 12033 1726867212.51307: _low_level_execute_command(): starting 12033 1726867212.51312: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867212.51756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.51789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.51792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.51795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867212.51797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found <<< 12033 1726867212.51800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.51850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.51857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.51859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.51909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.53895: stdout chunk (state=3): >>>/root <<< 12033 1726867212.54005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.54021: stderr chunk (state=3): >>><<< 12033 1726867212.54024: stdout chunk (state=3): >>><<< 12033 1726867212.54041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.54052: _low_level_execute_command(): starting 12033 1726867212.54055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012 `" && echo ansible-tmp-1726867212.5403945-14534-250671607164012="` echo /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012 `" ) && sleep 0' 12033 1726867212.54453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.54457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.54459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867212.54462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.54508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.54511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.54560: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.56485: stdout chunk (state=3): >>>ansible-tmp-1726867212.5403945-14534-250671607164012=/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012 <<< 12033 1726867212.56604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.56622: stderr chunk (state=3): >>><<< 12033 1726867212.56625: stdout chunk (state=3): >>><<< 12033 1726867212.56639: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867212.5403945-14534-250671607164012=/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.56660: variable 'ansible_module_compression' from source: unknown 12033 1726867212.56696: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867212.56731: variable 'ansible_facts' from source: unknown 12033 1726867212.56779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py 12033 1726867212.56896: Sending initial data 12033 1726867212.56899: Sent initial data (156 bytes) 12033 1726867212.57283: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.57317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867212.57320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.57322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.57324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.57327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.57374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.57385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.57425: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12033 1726867212.59222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867212.59226: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867212.59263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867212.59310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpaiosylw5 /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py <<< 12033 1726867212.59316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py" <<< 12033 1726867212.59354: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpaiosylw5" to remote "/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py" <<< 12033 1726867212.59898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.59933: stderr chunk (state=3): >>><<< 12033 1726867212.59936: stdout chunk (state=3): >>><<< 12033 1726867212.59961: done transferring module to remote 12033 1726867212.59970: _low_level_execute_command(): starting 12033 1726867212.59972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/ /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py && sleep 0' 12033 1726867212.60373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867212.60379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 
1726867212.60382: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867212.60384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.60433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.60436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.60487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.62257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.62280: stderr chunk (state=3): >>><<< 12033 1726867212.62283: stdout chunk (state=3): >>><<< 12033 1726867212.62295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.62298: _low_level_execute_command(): starting 12033 1726867212.62301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/AnsiballZ_command.py && sleep 0' 12033 1726867212.62673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.62679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.62691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.62739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.62755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12033 1726867212.62804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.78805: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 17:20:12.782297", "end": "2024-09-20 17:20:12.785539", "delta": "0:00:00.003242", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867212.80402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867212.80432: stderr chunk (state=3): >>><<< 12033 1726867212.80435: stdout chunk (state=3): >>><<< 12033 1726867212.80452: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 17:20:12.782297", "end": "2024-09-20 17:20:12.785539", "delta": "0:00:00.003242", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867212.80483: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867212.80492: _low_level_execute_command(): starting 12033 1726867212.80495: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867212.5403945-14534-250671607164012/ > /dev/null 2>&1 && sleep 0' 12033 1726867212.80945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 12033 1726867212.80950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.80952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.80954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.80956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867212.80958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.81009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.81017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.81020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.81061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.82927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.82953: stderr chunk (state=3): >>><<< 12033 1726867212.82956: stdout chunk (state=3): >>><<< 12033 1726867212.82968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.82974: handler run complete 12033 1726867212.82995: Evaluated conditional (False): False 12033 1726867212.83105: variable 'bond_opt' from source: unknown 12033 1726867212.83108: variable 'result' from source: set_fact 12033 1726867212.83120: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867212.83129: attempt loop complete, returning result 12033 1726867212.83142: variable 'bond_opt' from source: unknown 12033 1726867212.83194: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003242", "end": "2024-09-20 17:20:12.785539", "rc": 0, "start": "2024-09-20 17:20:12.782297" } STDOUT: active-backup 1 12033 1726867212.83387: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867212.83391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.83393: variable 'omit' from source: magic vars 12033 1726867212.83446: variable 'ansible_distribution_major_version' from source: facts 12033 1726867212.83449: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867212.83453: variable 'omit' from source: magic vars 12033 1726867212.83465: variable 'omit' from source: magic vars 12033 1726867212.83578: variable 'controller_device' from source: play vars 12033 1726867212.83583: variable 'bond_opt' from source: unknown 12033 1726867212.83596: variable 'omit' from source: magic vars 12033 1726867212.83621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867212.83624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.83627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867212.83637: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867212.83640: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.83642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.83692: Set connection var ansible_pipelining to False 12033 1726867212.83698: Set connection var ansible_shell_executable to /bin/sh 12033 1726867212.83705: Set connection var ansible_timeout to 10 12033 1726867212.83710: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867212.83713: Set connection var ansible_connection to ssh 12033 1726867212.83717: Set connection var ansible_shell_type to sh 12033 1726867212.83735: variable 'ansible_shell_executable' from source: unknown 12033 
1726867212.83738: variable 'ansible_connection' from source: unknown 12033 1726867212.83740: variable 'ansible_module_compression' from source: unknown 12033 1726867212.83742: variable 'ansible_shell_type' from source: unknown 12033 1726867212.83745: variable 'ansible_shell_executable' from source: unknown 12033 1726867212.83747: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867212.83749: variable 'ansible_pipelining' from source: unknown 12033 1726867212.83752: variable 'ansible_timeout' from source: unknown 12033 1726867212.83756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867212.83822: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867212.83833: variable 'omit' from source: magic vars 12033 1726867212.83836: starting attempt loop 12033 1726867212.83838: running the handler 12033 1726867212.83846: _low_level_execute_command(): starting 12033 1726867212.83848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867212.84254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.84296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867212.84299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.84305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867212.84308: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.84310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.84347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.84350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.84406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.86025: stdout chunk (state=3): >>>/root <<< 12033 1726867212.86126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.86148: stderr chunk (state=3): >>><<< 12033 1726867212.86151: stdout chunk (state=3): >>><<< 12033 1726867212.86162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.86170: _low_level_execute_command(): starting 12033 1726867212.86174: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655 `" && echo ansible-tmp-1726867212.8616161-14534-2472379313655="` echo /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655 `" ) && sleep 0' 12033 1726867212.86555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.86564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.86574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.86587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 
1726867212.86627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867212.86641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.86690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.88589: stdout chunk (state=3): >>>ansible-tmp-1726867212.8616161-14534-2472379313655=/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655 <<< 12033 1726867212.88697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.88723: stderr chunk (state=3): >>><<< 12033 1726867212.88726: stdout chunk (state=3): >>><<< 12033 1726867212.88737: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867212.8616161-14534-2472379313655=/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.88753: variable 'ansible_module_compression' from source: unknown 12033 1726867212.88781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867212.88796: variable 'ansible_facts' from source: unknown 12033 1726867212.88842: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py 12033 1726867212.88931: Sending initial data 12033 1726867212.88934: Sent initial data (154 bytes) 12033 1726867212.89342: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.89345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.89348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867212.89350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867212.89351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.89389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.89406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.89449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.91036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867212.91043: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867212.91085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867212.91126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpsgpx0d67 /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py <<< 12033 1726867212.91133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py" <<< 12033 1726867212.91169: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpsgpx0d67" to remote "/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py" <<< 12033 1726867212.91176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py" <<< 12033 1726867212.91717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.91750: stderr chunk (state=3): >>><<< 12033 1726867212.91754: stdout chunk (state=3): >>><<< 12033 1726867212.91774: done transferring module to remote 12033 1726867212.91783: _low_level_execute_command(): starting 12033 1726867212.91786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/ /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py && sleep 0' 12033 1726867212.92166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.92197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.92200: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.92202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.92250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.92254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.92306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867212.94131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867212.94149: stderr chunk (state=3): >>><<< 12033 1726867212.94152: stdout chunk (state=3): >>><<< 12033 1726867212.94163: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867212.94166: _low_level_execute_command(): starting 12033 1726867212.94171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/AnsiballZ_command.py && sleep 0' 12033 1726867212.94544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.94582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867212.94585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867212.94587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.94589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867212.94591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867212.94633: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867212.94636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867212.94694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.10806: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 17:20:13.102440", "end": "2024-09-20 17:20:13.105563", "delta": "0:00:00.003123", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867213.12418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867213.12423: stdout chunk (state=3): >>><<< 12033 1726867213.12425: stderr chunk (state=3): >>><<< 12033 1726867213.12484: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 17:20:13.102440", "end": "2024-09-20 17:20:13.105563", "delta": "0:00:00.003123", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867213.12493: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867213.12513: _low_level_execute_command(): starting 12033 1726867213.12523: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867212.8616161-14534-2472379313655/ > /dev/null 2>&1 && sleep 0' 12033 1726867213.13208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867213.13225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.13242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.13263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.13283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.13294: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867213.13352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.13409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.13428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.13458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.13538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.15484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.15488: stdout chunk (state=3): >>><<< 12033 1726867213.15490: stderr chunk (state=3): >>><<< 12033 1726867213.15579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.15583: handler run complete 12033 1726867213.15585: Evaluated conditional (False): False 12033 1726867213.15907: variable 'bond_opt' from source: unknown 12033 1726867213.15910: variable 'result' from source: set_fact 12033 1726867213.15925: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867213.15937: attempt loop complete, returning result 12033 1726867213.15957: variable 'bond_opt' from source: unknown 12033 1726867213.16026: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.003123", "end": "2024-09-20 17:20:13.105563", "rc": 0, "start": "2024-09-20 17:20:13.102440" } STDOUT: 60 12033 1726867213.16584: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.16587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.16589: variable 'omit' from source: magic vars 12033 1726867213.16859: variable 'ansible_distribution_major_version' from source: facts 12033 1726867213.16868: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867213.16871: variable 'omit' from source: magic vars 12033 1726867213.16887: variable 'omit' from source: magic vars 12033 1726867213.17246: variable 'controller_device' from source: play vars 12033 1726867213.17249: variable 'bond_opt' from source: unknown 12033 1726867213.17343: variable 'omit' from source: magic vars 12033 1726867213.17396: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867213.17406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867213.17409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867213.17423: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867213.17426: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.17428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.17709: Set connection var ansible_pipelining to False 12033 1726867213.17716: Set connection var ansible_shell_executable to /bin/sh 12033 1726867213.17724: Set connection var ansible_timeout to 10 12033 1726867213.17729: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867213.17732: Set connection var ansible_connection to ssh 12033 1726867213.17737: Set connection var ansible_shell_type to sh 12033 1726867213.17755: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.17758: variable 'ansible_connection' from source: unknown 12033 1726867213.17761: variable 'ansible_module_compression' from source: unknown 12033 1726867213.17763: variable 'ansible_shell_type' from source: unknown 12033 1726867213.17765: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.17767: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.17887: variable 'ansible_pipelining' from source: unknown 12033 1726867213.17890: variable 'ansible_timeout' from source: unknown 12033 1726867213.17892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.17894: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867213.17897: variable 'omit' from source: magic vars 12033 1726867213.17898: starting attempt loop 12033 1726867213.17903: running the handler 12033 1726867213.17906: _low_level_execute_command(): starting 12033 1726867213.17993: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867213.19194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.19332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.19408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.19540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.21030: stdout chunk (state=3): >>>/root <<< 12033 
1726867213.21184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.21189: stdout chunk (state=3): >>><<< 12033 1726867213.21198: stderr chunk (state=3): >>><<< 12033 1726867213.21214: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.21223: _low_level_execute_command(): starting 12033 1726867213.21228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193 `" && echo ansible-tmp-1726867213.212135-14534-191117196429193="` echo /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193 `" ) && sleep 0' 12033 1726867213.21845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 
1726867213.21854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.21909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.21912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.21915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.21999: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.22040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.22117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.24004: stdout chunk (state=3): >>>ansible-tmp-1726867213.212135-14534-191117196429193=/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193 <<< 12033 1726867213.24160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.24163: stdout chunk (state=3): >>><<< 12033 1726867213.24165: stderr chunk (state=3): >>><<< 12033 1726867213.24183: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867213.212135-14534-191117196429193=/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.24382: variable 'ansible_module_compression' from source: unknown 12033 1726867213.24385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867213.24388: variable 'ansible_facts' from source: unknown 12033 1726867213.24390: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py 12033 1726867213.24526: Sending initial data 12033 1726867213.24528: Sent initial data (155 bytes) 12033 1726867213.25172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.25201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.25224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.25243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.25324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.26986: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867213.27065: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867213.27108: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpsa_9rtbu /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py <<< 12033 1726867213.27122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py" <<< 12033 1726867213.27156: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpsa_9rtbu" to remote "/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py" <<< 12033 1726867213.27706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.27743: stderr chunk (state=3): >>><<< 12033 1726867213.27746: stdout chunk (state=3): >>><<< 12033 1726867213.27792: done transferring module to remote 12033 1726867213.27799: _low_level_execute_command(): starting 12033 1726867213.27804: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/ /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py && sleep 0' 12033 1726867213.28242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.28245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.28247: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.28249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.28251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.28295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.28298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.28351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.30086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.30111: stderr chunk (state=3): >>><<< 12033 1726867213.30114: stdout chunk (state=3): >>><<< 12033 1726867213.30127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.30130: _low_level_execute_command(): starting 12033 1726867213.30135: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/AnsiballZ_command.py && sleep 0' 12033 1726867213.30546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.30549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.30555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.30558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.30600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.30603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.30661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.46371: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 17:20:13.458199", "end": "2024-09-20 17:20:13.461414", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867213.47849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867213.47875: stderr chunk (state=3): >>><<< 12033 1726867213.47881: stdout chunk (state=3): >>><<< 12033 1726867213.47895: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 17:20:13.458199", "end": "2024-09-20 17:20:13.461414", "delta": "0:00:00.003215", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867213.47917: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867213.47921: _low_level_execute_command(): starting 12033 1726867213.47926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867213.212135-14534-191117196429193/ > /dev/null 2>&1 && sleep 0' 12033 1726867213.48488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.48541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.48591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.50411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.50435: stderr chunk (state=3): >>><<< 12033 1726867213.50438: stdout chunk (state=3): >>><<< 12033 1726867213.50456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.50459: handler run complete 12033 1726867213.50474: Evaluated conditional (False): False 12033 1726867213.50583: variable 'bond_opt' from source: 
unknown 12033 1726867213.50589: variable 'result' from source: set_fact 12033 1726867213.50600: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867213.50612: attempt loop complete, returning result 12033 1726867213.50626: variable 'bond_opt' from source: unknown 12033 1726867213.50675: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.003215", "end": "2024-09-20 17:20:13.461414", "rc": 0, "start": "2024-09-20 17:20:13.458199" } STDOUT: 192.0.2.128 12033 1726867213.50804: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.50807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.50810: variable 'omit' from source: magic vars 12033 1726867213.50931: variable 'ansible_distribution_major_version' from source: facts 12033 1726867213.50939: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867213.50941: variable 'omit' from source: magic vars 12033 1726867213.50944: variable 'omit' from source: magic vars 12033 1726867213.51078: variable 'controller_device' from source: play vars 12033 1726867213.51082: variable 'bond_opt' from source: unknown 12033 1726867213.51108: variable 'omit' from source: magic vars 12033 1726867213.51124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867213.51132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867213.51140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12033 1726867213.51201: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867213.51204: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.51207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.51382: Set connection var ansible_pipelining to False 12033 1726867213.51385: Set connection var ansible_shell_executable to /bin/sh 12033 1726867213.51388: Set connection var ansible_timeout to 10 12033 1726867213.51390: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867213.51392: Set connection var ansible_connection to ssh 12033 1726867213.51394: Set connection var ansible_shell_type to sh 12033 1726867213.51396: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.51398: variable 'ansible_connection' from source: unknown 12033 1726867213.51400: variable 'ansible_module_compression' from source: unknown 12033 1726867213.51402: variable 'ansible_shell_type' from source: unknown 12033 1726867213.51404: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.51406: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.51408: variable 'ansible_pipelining' from source: unknown 12033 1726867213.51410: variable 'ansible_timeout' from source: unknown 12033 1726867213.51412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.51414: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867213.51416: variable 'omit' from source: magic vars 12033 1726867213.51418: starting attempt loop 12033 1726867213.51420: running the handler 12033 
1726867213.51421: _low_level_execute_command(): starting 12033 1726867213.51423: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867213.52016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867213.52020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.52022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.52036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.52049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.52083: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867213.52092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.52098: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867213.52101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867213.52103: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867213.52182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.52187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.52190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.52192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.52194: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867213.52197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.52234: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.52237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.52257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.52327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.53937: stdout chunk (state=3): >>>/root <<< 12033 1726867213.54029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.54065: stderr chunk (state=3): >>><<< 12033 1726867213.54074: stdout chunk (state=3): >>><<< 12033 1726867213.54093: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.54103: _low_level_execute_command(): starting 12033 1726867213.54106: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695 `" && echo ansible-tmp-1726867213.5409222-14534-163291879692695="` echo /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695 `" ) && sleep 0' 12033 1726867213.54523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.54526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.54528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.54534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867213.54536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.54572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.54575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.54632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.56526: stdout chunk (state=3): 
>>>ansible-tmp-1726867213.5409222-14534-163291879692695=/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695 <<< 12033 1726867213.56676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.56683: stdout chunk (state=3): >>><<< 12033 1726867213.56685: stderr chunk (state=3): >>><<< 12033 1726867213.56702: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867213.5409222-14534-163291879692695=/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.56728: variable 'ansible_module_compression' from source: unknown 12033 1726867213.56786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867213.56797: variable 'ansible_facts' 
from source: unknown 12033 1726867213.56872: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py 12033 1726867213.57092: Sending initial data 12033 1726867213.57095: Sent initial data (156 bytes) 12033 1726867213.57696: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.57710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.57730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.57805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.59333: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867213.59392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867213.59463: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmprjgtt0gn /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py <<< 12033 1726867213.59479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py" <<< 12033 1726867213.59501: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmprjgtt0gn" to remote "/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py" <<< 12033 1726867213.60230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.60276: stderr chunk (state=3): >>><<< 12033 1726867213.60329: stdout chunk (state=3): >>><<< 12033 1726867213.60332: done transferring module to remote 12033 1726867213.60340: _low_level_execute_command(): starting 12033 1726867213.60348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/ /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py && sleep 0' 12033 1726867213.60997: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.61044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.61058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.61075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.61153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.63012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.63015: stdout chunk (state=3): >>><<< 12033 1726867213.63017: stderr chunk (state=3): >>><<< 12033 1726867213.63101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.63104: _low_level_execute_command(): starting 12033 1726867213.63107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/AnsiballZ_command.py && sleep 0' 12033 1726867213.63554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.63566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867213.63579: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.63624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.63648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.63691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.79488: stdout chunk (state=3): >>> {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 17:20:13.789436", "end": "2024-09-20 17:20:13.792488", "delta": "0:00:00.003052", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867213.80955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867213.80985: stderr chunk (state=3): >>><<< 12033 1726867213.80988: stdout chunk (state=3): >>><<< 12033 1726867213.81006: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 17:20:13.789436", "end": "2024-09-20 17:20:13.792488", "delta": "0:00:00.003052", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867213.81026: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867213.81030: _low_level_execute_command(): starting 12033 1726867213.81037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867213.5409222-14534-163291879692695/ > /dev/null 2>&1 && sleep 0' 12033 1726867213.81504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.81507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.81510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.81514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867213.81520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.81563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.81571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.81573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.81620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.83574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.83600: stderr chunk (state=3): >>><<< 12033 1726867213.83606: stdout chunk (state=3): >>><<< 12033 1726867213.83615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.83620: handler run complete 12033 1726867213.83635: Evaluated conditional (False): False 12033 1726867213.83741: variable 'bond_opt' from source: unknown 12033 1726867213.83746: variable 'result' from source: set_fact 12033 1726867213.83759: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867213.83768: attempt loop complete, returning result 12033 1726867213.83786: variable 'bond_opt' from source: unknown 12033 1726867213.83836: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.003052", "end": "2024-09-20 17:20:13.792488", "rc": 0, "start": "2024-09-20 17:20:13.789436" } STDOUT: none 0 12033 1726867213.83962: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.83965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.83970: variable 'omit' from source: magic vars 12033 1726867213.84061: variable 'ansible_distribution_major_version' from source: facts 12033 1726867213.84064: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867213.84069: variable 'omit' from source: magic vars 12033 1726867213.84087: variable 'omit' from source: magic vars 12033 1726867213.84197: variable 'controller_device' from source: play vars 12033 1726867213.84205: variable 'bond_opt' from source: unknown 12033 1726867213.84218: variable 'omit' from source: magic vars 12033 1726867213.84235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 12033 1726867213.84242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867213.84248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867213.84258: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867213.84261: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.84264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.84317: Set connection var ansible_pipelining to False 12033 1726867213.84324: Set connection var ansible_shell_executable to /bin/sh 12033 1726867213.84331: Set connection var ansible_timeout to 10 12033 1726867213.84335: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867213.84338: Set connection var ansible_connection to ssh 12033 1726867213.84342: Set connection var ansible_shell_type to sh 12033 1726867213.84357: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.84360: variable 'ansible_connection' from source: unknown 12033 1726867213.84362: variable 'ansible_module_compression' from source: unknown 12033 1726867213.84364: variable 'ansible_shell_type' from source: unknown 12033 1726867213.84366: variable 'ansible_shell_executable' from source: unknown 12033 1726867213.84369: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867213.84373: variable 'ansible_pipelining' from source: unknown 12033 1726867213.84375: variable 'ansible_timeout' from source: unknown 12033 1726867213.84381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867213.84447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867213.84454: variable 'omit' from source: magic vars 12033 1726867213.84457: starting attempt loop 12033 1726867213.84460: running the handler 12033 1726867213.84466: _low_level_execute_command(): starting 12033 1726867213.84468: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867213.84875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.84881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.84913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867213.84916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867213.84919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.84921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.84964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.84969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.84982: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.85032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.86608: stdout chunk (state=3): >>>/root <<< 12033 1726867213.86715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.86735: stderr chunk (state=3): >>><<< 12033 1726867213.86738: stdout chunk (state=3): >>><<< 12033 1726867213.86749: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.86756: _low_level_execute_command(): starting 12033 1726867213.86761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315 `" && echo 
ansible-tmp-1726867213.8674858-14534-16134736425315="` echo /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315 `" ) && sleep 0' 12033 1726867213.87153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.87156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.87182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867213.87186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867213.87188: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.87198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.87247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.87251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.87309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.89248: stdout chunk (state=3): >>>ansible-tmp-1726867213.8674858-14534-16134736425315=/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315 <<< 12033 1726867213.89353: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 12033 1726867213.89376: stderr chunk (state=3): >>><<< 12033 1726867213.89382: stdout chunk (state=3): >>><<< 12033 1726867213.89394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867213.8674858-14534-16134736425315=/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.89410: variable 'ansible_module_compression' from source: unknown 12033 1726867213.89436: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867213.89451: variable 'ansible_facts' from source: unknown 12033 1726867213.89498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py 12033 
1726867213.89576: Sending initial data 12033 1726867213.89582: Sent initial data (155 bytes) 12033 1726867213.90011: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.90014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867213.90016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.90019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.90023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.90068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.90071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.90119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.91654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867213.91660: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867213.91692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867213.91736: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpg1ak4el4 /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py <<< 12033 1726867213.91744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py" <<< 12033 1726867213.91782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpg1ak4el4" to remote "/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py" <<< 12033 1726867213.91788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py" <<< 12033 1726867213.92476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.92610: stdout chunk (state=3): >>><<< 12033 1726867213.92613: stderr chunk (state=3): >>><<< 12033 1726867213.92615: done transferring module to remote 12033 1726867213.92617: _low_level_execute_command(): starting 12033 1726867213.92619: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/ 
/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py && sleep 0' 12033 1726867213.93087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.93100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867213.93112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.93167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.93175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.93183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.93216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867213.95283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867213.95291: stdout chunk (state=3): >>><<< 12033 1726867213.95297: stderr chunk (state=3): >>><<< 12033 1726867213.95303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867213.95306: _low_level_execute_command(): starting 12033 1726867213.95309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/AnsiballZ_command.py && sleep 0' 12033 1726867213.95786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867213.95795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.95806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.95819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.95837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.95840: stderr chunk (state=3): >>>debug2: match not found <<< 12033 
1726867213.95847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.95947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867213.95950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867213.95952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867213.95954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867213.95956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867213.95958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867213.95960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867213.95962: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867213.95964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867213.95979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867213.95997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867213.96014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867213.96086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.12032: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 17:20:14.114903", "end": "2024-09-20 17:20:14.117981", "delta": "0:00:00.003078", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867214.13589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867214.13593: stdout chunk (state=3): >>><<< 12033 1726867214.13596: stderr chunk (state=3): >>><<< 12033 1726867214.13625: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 17:20:14.114903", "end": "2024-09-20 17:20:14.117981", "delta": "0:00:00.003078", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867214.13655: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867214.13659: _low_level_execute_command(): starting 12033 1726867214.13661: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867213.8674858-14534-16134736425315/ > /dev/null 2>&1 && sleep 0' 12033 1726867214.14280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867214.14385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867214.14392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867214.14416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.14433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.14522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.16352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.16356: stdout chunk (state=3): >>><<< 12033 1726867214.16362: stderr chunk (state=3): >>><<< 12033 1726867214.16384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867214.16389: handler run complete 12033 1726867214.16414: Evaluated conditional (False): False 12033 1726867214.16570: variable 'bond_opt' from source: unknown 12033 1726867214.16576: variable 'result' from source: set_fact 12033 1726867214.16596: Evaluated conditional (bond_opt.value in result.stdout): True 12033 1726867214.16610: attempt loop complete, returning result 12033 1726867214.16782: variable 'bond_opt' from source: unknown 12033 1726867214.16785: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.003078", "end": "2024-09-20 17:20:14.117981", "rc": 0, "start": "2024-09-20 17:20:14.114903" } STDOUT: test1 12033 1726867214.16882: dumping result to json 12033 1726867214.16886: done dumping result, returning 12033 1726867214.16888: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [0affcac9-a3a5-74bb-502b-000000000c2a] 12033 1726867214.16891: sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2a 12033 1726867214.16955: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2a 12033 1726867214.16959: WORKER PROCESS EXITING 12033 1726867214.17123: no more pending results, returning what we have 12033 1726867214.17127: results queue empty 12033 1726867214.17128: checking for any_errors_fatal 12033 1726867214.17130: done checking for any_errors_fatal 12033 1726867214.17130: checking for max_fail_percentage 12033 1726867214.17132: done checking for max_fail_percentage 12033 1726867214.17133: checking to see if all hosts have failed and 
the running result is not ok 12033 1726867214.17134: done checking to see if all hosts have failed 12033 1726867214.17135: getting the remaining hosts for this loop 12033 1726867214.17137: done getting the remaining hosts for this loop 12033 1726867214.17140: getting the next task for host managed_node3 12033 1726867214.17148: done getting next task for host managed_node3 12033 1726867214.17151: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 12033 1726867214.17156: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867214.17160: getting variables 12033 1726867214.17161: in VariableManager get_vars() 12033 1726867214.17516: Calling all_inventory to load vars for managed_node3 12033 1726867214.17519: Calling groups_inventory to load vars for managed_node3 12033 1726867214.17522: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867214.17532: Calling all_plugins_play to load vars for managed_node3 12033 1726867214.17535: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867214.17538: Calling groups_plugins_play to load vars for managed_node3 12033 1726867214.19134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867214.20798: done with get_vars() 12033 1726867214.20825: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 17:20:14 -0400 (0:00:01.737) 0:00:53.325 ****** 12033 1726867214.20929: entering _queue_task() for managed_node3/include_tasks 12033 1726867214.21387: worker is 1 (out of 1 available) 12033 1726867214.21405: exiting _queue_task() for managed_node3/include_tasks 12033 1726867214.21416: done queuing things up, now waiting for results queue to drain 12033 1726867214.21417: waiting for pending results... 
12033 1726867214.21888: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 12033 1726867214.21906: in run() - task 0affcac9-a3a5-74bb-502b-000000000c2c 12033 1726867214.21929: variable 'ansible_search_path' from source: unknown 12033 1726867214.21937: variable 'ansible_search_path' from source: unknown 12033 1726867214.21983: calling self._execute() 12033 1726867214.22093: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867214.22196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867214.22199: variable 'omit' from source: magic vars 12033 1726867214.22542: variable 'ansible_distribution_major_version' from source: facts 12033 1726867214.22629: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867214.22632: _execute() done 12033 1726867214.22635: dumping result to json 12033 1726867214.22642: done dumping result, returning 12033 1726867214.22646: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [0affcac9-a3a5-74bb-502b-000000000c2c] 12033 1726867214.22648: sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2c 12033 1726867214.22759: no more pending results, returning what we have 12033 1726867214.22764: in VariableManager get_vars() 12033 1726867214.22820: Calling all_inventory to load vars for managed_node3 12033 1726867214.22823: Calling groups_inventory to load vars for managed_node3 12033 1726867214.22826: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867214.22839: Calling all_plugins_play to load vars for managed_node3 12033 1726867214.22842: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867214.22845: Calling groups_plugins_play to load vars for managed_node3 12033 1726867214.23593: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2c 12033 1726867214.23596: WORKER PROCESS EXITING 12033 
1726867214.24741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867214.26520: done with get_vars() 12033 1726867214.26539: variable 'ansible_search_path' from source: unknown 12033 1726867214.26541: variable 'ansible_search_path' from source: unknown 12033 1726867214.26550: variable 'item' from source: include params 12033 1726867214.26666: variable 'item' from source: include params 12033 1726867214.26707: we have included files to process 12033 1726867214.26708: generating all_blocks data 12033 1726867214.26710: done generating all_blocks data 12033 1726867214.26715: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867214.26716: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867214.26718: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 12033 1726867214.26930: done processing included file 12033 1726867214.26932: iterating over new_blocks loaded from include file 12033 1726867214.26933: in VariableManager get_vars() 12033 1726867214.26954: done with get_vars() 12033 1726867214.26956: filtering new block on tags 12033 1726867214.26983: done filtering new block on tags 12033 1726867214.26985: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 12033 1726867214.26991: extending task lists for all hosts with included blocks 12033 1726867214.27429: done extending task lists 12033 1726867214.27431: done processing included files 12033 1726867214.27431: results queue empty 12033 1726867214.27432: checking for any_errors_fatal 12033 1726867214.27556: 
done checking for any_errors_fatal 12033 1726867214.27558: checking for max_fail_percentage 12033 1726867214.27559: done checking for max_fail_percentage 12033 1726867214.27560: checking to see if all hosts have failed and the running result is not ok 12033 1726867214.27561: done checking to see if all hosts have failed 12033 1726867214.27562: getting the remaining hosts for this loop 12033 1726867214.27563: done getting the remaining hosts for this loop 12033 1726867214.27566: getting the next task for host managed_node3 12033 1726867214.27570: done getting next task for host managed_node3 12033 1726867214.27573: ^ task is: TASK: ** TEST check IPv4 12033 1726867214.27576: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867214.27580: getting variables 12033 1726867214.27581: in VariableManager get_vars() 12033 1726867214.27594: Calling all_inventory to load vars for managed_node3 12033 1726867214.27596: Calling groups_inventory to load vars for managed_node3 12033 1726867214.27598: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867214.27606: Calling all_plugins_play to load vars for managed_node3 12033 1726867214.27608: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867214.27611: Calling groups_plugins_play to load vars for managed_node3 12033 1726867214.30189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867214.32060: done with get_vars() 12033 1726867214.32084: done getting variables 12033 1726867214.32140: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 17:20:14 -0400 (0:00:00.112) 0:00:53.437 ****** 12033 1726867214.32173: entering _queue_task() for managed_node3/command 12033 1726867214.32521: worker is 1 (out of 1 available) 12033 1726867214.32540: exiting _queue_task() for managed_node3/command 12033 1726867214.32554: done queuing things up, now waiting for results queue to drain 12033 1726867214.32555: waiting for pending results... 
12033 1726867214.32790: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 12033 1726867214.32927: in run() - task 0affcac9-a3a5-74bb-502b-000000000da6 12033 1726867214.32950: variable 'ansible_search_path' from source: unknown 12033 1726867214.32963: variable 'ansible_search_path' from source: unknown 12033 1726867214.33016: calling self._execute() 12033 1726867214.33121: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867214.33133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867214.33147: variable 'omit' from source: magic vars 12033 1726867214.33527: variable 'ansible_distribution_major_version' from source: facts 12033 1726867214.33545: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867214.33629: variable 'omit' from source: magic vars 12033 1726867214.33632: variable 'omit' from source: magic vars 12033 1726867214.33780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867214.35934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867214.36004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867214.36052: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867214.36094: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867214.36129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867214.36212: variable 'interface' from source: include params 12033 1726867214.36221: variable 'controller_device' from source: play vars 12033 1726867214.36295: variable 'controller_device' from source: play vars 12033 1726867214.36324: variable 'omit' 
from source: magic vars 12033 1726867214.36382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867214.36395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867214.36416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867214.36436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867214.36560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867214.36564: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867214.36566: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867214.36569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867214.36612: Set connection var ansible_pipelining to False 12033 1726867214.36626: Set connection var ansible_shell_executable to /bin/sh 12033 1726867214.36639: Set connection var ansible_timeout to 10 12033 1726867214.36650: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867214.36656: Set connection var ansible_connection to ssh 12033 1726867214.36665: Set connection var ansible_shell_type to sh 12033 1726867214.36698: variable 'ansible_shell_executable' from source: unknown 12033 1726867214.36707: variable 'ansible_connection' from source: unknown 12033 1726867214.36713: variable 'ansible_module_compression' from source: unknown 12033 1726867214.36719: variable 'ansible_shell_type' from source: unknown 12033 1726867214.36725: variable 'ansible_shell_executable' from source: unknown 12033 1726867214.36731: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867214.36738: variable 'ansible_pipelining' from source: unknown 
12033 1726867214.36743: variable 'ansible_timeout' from source: unknown 12033 1726867214.36750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867214.36856: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867214.36871: variable 'omit' from source: magic vars 12033 1726867214.36884: starting attempt loop 12033 1726867214.36899: running the handler 12033 1726867214.36982: _low_level_execute_command(): starting 12033 1726867214.36986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867214.37629: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867214.37693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867214.37751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867214.37768: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.37795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.38076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.39690: stdout chunk (state=3): >>>/root <<< 12033 1726867214.39831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.39835: stdout chunk (state=3): >>><<< 12033 1726867214.39837: stderr chunk (state=3): >>><<< 12033 1726867214.39856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867214.40025: _low_level_execute_command(): starting 12033 1726867214.40036: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322 `" && echo ansible-tmp-1726867214.3987286-14630-76258379048322="` echo /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322 `" ) && sleep 0' 12033 1726867214.40832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867214.40844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867214.40859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867214.40876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867214.40896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867214.40952: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867214.41007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867214.41024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.41174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.41240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.43585: stdout chunk (state=3): 
>>>ansible-tmp-1726867214.3987286-14630-76258379048322=/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322 <<< 12033 1726867214.43588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.43590: stdout chunk (state=3): >>><<< 12033 1726867214.43592: stderr chunk (state=3): >>><<< 12033 1726867214.43595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867214.3987286-14630-76258379048322=/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867214.43597: variable 'ansible_module_compression' from source: unknown 12033 1726867214.43599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867214.43601: variable 'ansible_facts' 
from source: unknown 12033 1726867214.43603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py 12033 1726867214.43923: Sending initial data 12033 1726867214.43933: Sent initial data (155 bytes) 12033 1726867214.45096: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867214.45208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.45228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.45337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.46854: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867214.46895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867214.46937: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkv9sm5ai /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py <<< 12033 1726867214.46941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py" <<< 12033 1726867214.47101: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpkv9sm5ai" to remote "/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py" <<< 12033 1726867214.48347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.48373: stderr chunk (state=3): >>><<< 12033 1726867214.48376: stdout chunk (state=3): >>><<< 12033 1726867214.48403: done transferring module to remote 12033 1726867214.48411: _low_level_execute_command(): starting 12033 1726867214.48416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/ /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py && sleep 0' 12033 1726867214.49728: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867214.49735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867214.49884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.49935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.50020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.51828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.51856: stderr chunk (state=3): >>><<< 12033 1726867214.51859: stdout chunk (state=3): >>><<< 12033 1726867214.51875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867214.51880: _low_level_execute_command(): starting 12033 1726867214.51886: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/AnsiballZ_command.py && sleep 0' 12033 1726867214.53103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867214.53188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867214.53191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867214.53384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867214.53388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867214.53391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867214.53393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.53395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.53471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.69139: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:14.685305", "end": "2024-09-20 17:20:14.688973", "delta": "0:00:00.003668", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867214.70602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.70739: stderr chunk (state=3): >>>Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867214.70743: stdout chunk (state=3): >>><<< 12033 1726867214.70745: stderr chunk (state=3): >>><<< 12033 1726867214.70764: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:14.685305", "end": "2024-09-20 17:20:14.688973", "delta": "0:00:00.003668", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
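The cycle traced above — run `ip -4 a s nm-bond` via `ansible.legacy.command`, evaluate `address in result.stdout`, then retry ("attempt 1 of 21", "20 retries left", roughly two seconds apart) — is the behavior of an Ansible `until` loop. The playbook itself is not part of this log, so the following is only a hypothetical reconstruction from the values visible in the trace (task name, command, conditional, and retry count are taken from the log; `delay` is inferred from the timestamps):

```yaml
# Hypothetical sketch -- the actual task definition is not shown in this log.
- name: "** TEST check IPv4"
  ansible.builtin.command: ip -4 a s nm-bond
  register: result
  until: address in result.stdout   # 'address' comes from include params per the trace
  retries: 20                       # log reports "attempt 1 of 21", "20 retries left"
  delay: 2                          # attempts in the log are ~2 seconds apart
```

Each attempt repeats the full remote lifecycle seen in the trace: create a per-task temp dir, SFTP the `AnsiballZ_command.py` wrapper, `chmod u+x`, execute it with the remote Python, then remove the temp dir.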
12033 1726867214.70804: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867214.70808: _low_level_execute_command(): starting 12033 1726867214.70815: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867214.3987286-14630-76258379048322/ > /dev/null 2>&1 && sleep 0' 12033 1726867214.72050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867214.72056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867214.72256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found <<< 12033 1726867214.72260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867214.72292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867214.72804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867214.72857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867214.74927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867214.75007: stderr chunk (state=3): >>><<< 12033 1726867214.75011: stdout chunk (state=3): >>><<< 12033 1726867214.75029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12033 1726867214.75035: handler run complete 12033 1726867214.75060: Evaluated conditional (False): False 12033 1726867214.75206: variable 'address' from source: include params 12033 1726867214.75210: variable 'result' from source: set_fact 12033 1726867214.75228: Evaluated conditional (address in result.stdout): False 12033 1726867214.75231: Retrying task, attempt 1 of 21 FAILED - RETRYING: [managed_node3]: ** TEST check IPv4 (20 retries left). 12033 1726867216.75354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867216.75360: running the handler 12033 1726867216.75367: _low_level_execute_command(): starting 12033 1726867216.75370: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867216.75838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867216.75842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867216.75844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.75847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.75887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867216.75899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867216.75960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867216.77664: stdout chunk (state=3): >>>/root <<< 12033 1726867216.77761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867216.77789: stderr chunk (state=3): >>><<< 12033 1726867216.77792: stdout chunk (state=3): >>><<< 12033 1726867216.77808: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 12033 1726867216.77816: _low_level_execute_command(): starting 12033 1726867216.77821: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136 `" && echo ansible-tmp-1726867216.7780728-14630-118924190204136="` echo /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136 `" ) && sleep 0' 12033 1726867216.78238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.78241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.78244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867216.78251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867216.78253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.78307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867216.78310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867216.78312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 
1726867216.78351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867216.80255: stdout chunk (state=3): >>>ansible-tmp-1726867216.7780728-14630-118924190204136=/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136 <<< 12033 1726867216.80366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867216.80387: stderr chunk (state=3): >>><<< 12033 1726867216.80390: stdout chunk (state=3): >>><<< 12033 1726867216.80404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867216.7780728-14630-118924190204136=/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867216.80425: variable 'ansible_module_compression' from source: unknown 12033 1726867216.80455: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867216.80470: variable 'ansible_facts' from source: unknown 12033 1726867216.80515: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py 12033 1726867216.80600: Sending initial data 12033 1726867216.80603: Sent initial data (156 bytes) 12033 1726867216.81023: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.81026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.81028: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.81030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.81088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867216.81090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867216.81130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867216.82696: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867216.82766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867216.82829: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgrdp3ymy /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py <<< 12033 1726867216.82834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py" <<< 12033 1726867216.82907: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgrdp3ymy" to remote "/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py" <<< 12033 1726867216.83475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867216.83512: stderr chunk (state=3): >>><<< 12033 1726867216.83515: stdout chunk (state=3): >>><<< 12033 1726867216.83540: done transferring module to remote 12033 1726867216.83549: _low_level_execute_command(): starting 
12033 1726867216.83554: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/ /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py && sleep 0' 12033 1726867216.83981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867216.84054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.84058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.84076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867216.84086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867216.84155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867216.85991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867216.85994: stdout chunk (state=3): >>><<< 12033 1726867216.86000: stderr chunk (state=3): >>><<< 12033 1726867216.86017: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867216.86020: _low_level_execute_command(): starting 12033 1726867216.86025: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/AnsiballZ_command.py && sleep 0' 12033 1726867216.87135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867216.87156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867216.87159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.87182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867216.87186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 <<< 12033 1726867216.87188: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867216.87193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.87265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867216.87268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867216.87270: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867216.87272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867216.87274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867216.87276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867216.87280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867216.87282: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867216.87284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867216.87334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867216.87424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867216.87594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867216.87984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.03836: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.76/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", 
"nm-bond"], "start": "2024-09-20 17:20:17.031971", "end": "2024-09-20 17:20:17.035968", "delta": "0:00:00.003997", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867217.05507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867217.05512: stdout chunk (state=3): >>><<< 12033 1726867217.05514: stderr chunk (state=3): >>><<< 12033 1726867217.05517: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.76/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 238sec preferred_lft 238sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:17.031971", "end": "2024-09-20 17:20:17.035968", "delta": "0:00:00.003997", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867217.05520: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867217.05526: _low_level_execute_command(): starting 12033 1726867217.05528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867216.7780728-14630-118924190204136/ > /dev/null 2>&1 && sleep 0' 12033 1726867217.06075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867217.06104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867217.06119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 
1726867217.06131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.06142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867217.06194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.06262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867217.06307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.06353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.08385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.08388: stdout chunk (state=3): >>><<< 12033 1726867217.08391: stderr chunk (state=3): >>><<< 12033 1726867217.08393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867217.08395: handler run complete 12033 1726867217.08397: Evaluated conditional (False): False 12033 1726867217.08450: variable 'address' from source: include params 12033 1726867217.08534: variable 'result' from source: set_fact 12033 1726867217.08554: Evaluated conditional (address in result.stdout): True 12033 1726867217.08586: attempt loop complete, returning result 12033 1726867217.08597: _execute() done 12033 1726867217.08607: dumping result to json 12033 1726867217.08630: done dumping result, returning 12033 1726867217.08712: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0affcac9-a3a5-74bb-502b-000000000da6] 12033 1726867217.08715: sending task result for task 0affcac9-a3a5-74bb-502b-000000000da6 12033 1726867217.08836: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000da6 ok: [managed_node3] => { "attempts": 2, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003997", "end": "2024-09-20 17:20:17.035968", "rc": 0, "start": "2024-09-20 17:20:17.031971" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.76/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 238sec preferred_lft 
238sec 12033 1726867217.08900: no more pending results, returning what we have 12033 1726867217.08905: results queue empty 12033 1726867217.08906: checking for any_errors_fatal 12033 1726867217.08907: done checking for any_errors_fatal 12033 1726867217.08907: checking for max_fail_percentage 12033 1726867217.08909: done checking for max_fail_percentage 12033 1726867217.08910: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.08911: done checking to see if all hosts have failed 12033 1726867217.08912: getting the remaining hosts for this loop 12033 1726867217.08913: done getting the remaining hosts for this loop 12033 1726867217.08916: getting the next task for host managed_node3 12033 1726867217.08924: done getting next task for host managed_node3 12033 1726867217.08926: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 12033 1726867217.08929: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867217.08933: getting variables 12033 1726867217.08934: in VariableManager get_vars() 12033 1726867217.08980: Calling all_inventory to load vars for managed_node3 12033 1726867217.08983: Calling groups_inventory to load vars for managed_node3 12033 1726867217.08985: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.08995: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.08998: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.09000: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.09607: WORKER PROCESS EXITING 12033 1726867217.10301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.11775: done with get_vars() 12033 1726867217.11798: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 17:20:17 -0400 (0:00:02.797) 0:00:56.234 ****** 12033 1726867217.11892: entering _queue_task() for managed_node3/include_tasks 12033 1726867217.12200: worker is 1 (out of 1 available) 12033 1726867217.12218: exiting _queue_task() for managed_node3/include_tasks 12033 1726867217.12232: done queuing things up, now waiting for results queue to drain 12033 1726867217.12234: waiting for pending results... 
12033 1726867217.12539: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 12033 1726867217.12639: in run() - task 0affcac9-a3a5-74bb-502b-000000000c2d 12033 1726867217.12744: variable 'ansible_search_path' from source: unknown 12033 1726867217.12748: variable 'ansible_search_path' from source: unknown 12033 1726867217.12750: calling self._execute() 12033 1726867217.12811: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.12822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.12834: variable 'omit' from source: magic vars 12033 1726867217.13218: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.13237: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.13257: _execute() done 12033 1726867217.13261: dumping result to json 12033 1726867217.13264: done dumping result, returning 12033 1726867217.13266: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [0affcac9-a3a5-74bb-502b-000000000c2d] 12033 1726867217.13268: sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2d 12033 1726867217.13465: no more pending results, returning what we have 12033 1726867217.13469: in VariableManager get_vars() 12033 1726867217.13510: Calling all_inventory to load vars for managed_node3 12033 1726867217.13513: Calling groups_inventory to load vars for managed_node3 12033 1726867217.13515: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.13523: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.13526: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.13528: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.14047: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000c2d 12033 1726867217.14051: WORKER PROCESS EXITING 12033 
1726867217.14256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.15289: done with get_vars() 12033 1726867217.15306: variable 'ansible_search_path' from source: unknown 12033 1726867217.15307: variable 'ansible_search_path' from source: unknown 12033 1726867217.15314: variable 'item' from source: include params 12033 1726867217.15416: variable 'item' from source: include params 12033 1726867217.15448: we have included files to process 12033 1726867217.15449: generating all_blocks data 12033 1726867217.15451: done generating all_blocks data 12033 1726867217.15455: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867217.15456: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867217.15458: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 12033 1726867217.15639: done processing included file 12033 1726867217.15642: iterating over new_blocks loaded from include file 12033 1726867217.15644: in VariableManager get_vars() 12033 1726867217.15667: done with get_vars() 12033 1726867217.15669: filtering new block on tags 12033 1726867217.15696: done filtering new block on tags 12033 1726867217.15699: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 12033 1726867217.15703: extending task lists for all hosts with included blocks 12033 1726867217.16036: done extending task lists 12033 1726867217.16038: done processing included files 12033 1726867217.16038: results queue empty 12033 1726867217.16039: checking for any_errors_fatal 12033 1726867217.16044: 
done checking for any_errors_fatal 12033 1726867217.16045: checking for max_fail_percentage 12033 1726867217.16047: done checking for max_fail_percentage 12033 1726867217.16047: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.16048: done checking to see if all hosts have failed 12033 1726867217.16049: getting the remaining hosts for this loop 12033 1726867217.16050: done getting the remaining hosts for this loop 12033 1726867217.16055: getting the next task for host managed_node3 12033 1726867217.16061: done getting next task for host managed_node3 12033 1726867217.16063: ^ task is: TASK: ** TEST check IPv6 12033 1726867217.16067: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867217.16069: getting variables 12033 1726867217.16069: in VariableManager get_vars() 12033 1726867217.16086: Calling all_inventory to load vars for managed_node3 12033 1726867217.16088: Calling groups_inventory to load vars for managed_node3 12033 1726867217.16090: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.16094: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.16098: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.16101: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.17163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.18128: done with get_vars() 12033 1726867217.18142: done getting variables 12033 1726867217.18169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 17:20:17 -0400 (0:00:00.062) 0:00:56.297 ****** 12033 1726867217.18192: entering _queue_task() for managed_node3/command 12033 1726867217.18407: worker is 1 (out of 1 available) 12033 1726867217.18420: exiting _queue_task() for managed_node3/command 12033 1726867217.18432: done queuing things up, now waiting for results queue to drain 12033 1726867217.18433: waiting for pending results... 
12033 1726867217.18676: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 12033 1726867217.18684: in run() - task 0affcac9-a3a5-74bb-502b-000000000dc9 12033 1726867217.18692: variable 'ansible_search_path' from source: unknown 12033 1726867217.18696: variable 'ansible_search_path' from source: unknown 12033 1726867217.18725: calling self._execute() 12033 1726867217.18797: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.18803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.18813: variable 'omit' from source: magic vars 12033 1726867217.19072: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.19083: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.19091: variable 'omit' from source: magic vars 12033 1726867217.19129: variable 'omit' from source: magic vars 12033 1726867217.19245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867217.20966: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867217.21014: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867217.21041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867217.21068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867217.21089: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867217.21149: variable 'controller_device' from source: play vars 12033 1726867217.21166: variable 'omit' from source: magic vars 12033 1726867217.21189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867217.21212: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867217.21228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867217.21240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867217.21250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867217.21273: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867217.21276: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.21280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.21349: Set connection var ansible_pipelining to False 12033 1726867217.21356: Set connection var ansible_shell_executable to /bin/sh 12033 1726867217.21362: Set connection var ansible_timeout to 10 12033 1726867217.21367: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867217.21370: Set connection var ansible_connection to ssh 12033 1726867217.21374: Set connection var ansible_shell_type to sh 12033 1726867217.21392: variable 'ansible_shell_executable' from source: unknown 12033 1726867217.21395: variable 'ansible_connection' from source: unknown 12033 1726867217.21397: variable 'ansible_module_compression' from source: unknown 12033 1726867217.21400: variable 'ansible_shell_type' from source: unknown 12033 1726867217.21402: variable 'ansible_shell_executable' from source: unknown 12033 1726867217.21407: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.21411: variable 'ansible_pipelining' from source: unknown 12033 1726867217.21413: variable 'ansible_timeout' from source: unknown 12033 1726867217.21417: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867217.21490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867217.21498: variable 'omit' from source: magic vars 12033 1726867217.21502: starting attempt loop 12033 1726867217.21508: running the handler 12033 1726867217.21521: _low_level_execute_command(): starting 12033 1726867217.21527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867217.21990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.21993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.21996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.21998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.22053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867217.22060: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 12033 1726867217.22062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.22109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.23782: stdout chunk (state=3): >>>/root <<< 12033 1726867217.23884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.23908: stderr chunk (state=3): >>><<< 12033 1726867217.23911: stdout chunk (state=3): >>><<< 12033 1726867217.23929: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867217.23943: _low_level_execute_command(): starting 12033 1726867217.23947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800 `" && echo ansible-tmp-1726867217.2392824-14730-124930254413800="` echo /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800 `" ) && sleep 0' 12033 1726867217.24333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.24336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.24348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.24404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867217.24408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.24503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.26455: stdout chunk (state=3): >>>ansible-tmp-1726867217.2392824-14730-124930254413800=/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800 <<< 12033 1726867217.26562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.26584: stderr 
chunk (state=3): >>><<< 12033 1726867217.26588: stdout chunk (state=3): >>><<< 12033 1726867217.26604: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867217.2392824-14730-124930254413800=/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867217.26623: variable 'ansible_module_compression' from source: unknown 12033 1726867217.26662: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867217.26687: variable 'ansible_facts' from source: unknown 12033 1726867217.26740: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py 12033 1726867217.26840: Sending initial data 12033 1726867217.26843: Sent initial data (156 
bytes) 12033 1726867217.27252: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867217.27255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867217.27257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867217.27259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.27261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.27318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867217.27321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.27361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.28905: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867217.28943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867217.28988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpei3b4pcd /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py <<< 12033 1726867217.28991: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py" <<< 12033 1726867217.29032: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpei3b4pcd" to remote "/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py" <<< 12033 1726867217.29568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.29605: stderr chunk (state=3): >>><<< 12033 1726867217.29608: stdout chunk (state=3): >>><<< 12033 1726867217.29642: done transferring module to remote 12033 1726867217.29651: _low_level_execute_command(): starting 12033 1726867217.29655: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/ /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py && sleep 0' 12033 1726867217.30038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.30063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867217.30069: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867217.30072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.30125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867217.30130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867217.30132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.30173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.31916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.31936: stderr chunk (state=3): >>><<< 12033 1726867217.31939: stdout chunk (state=3): >>><<< 12033 1726867217.31952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867217.31955: _low_level_execute_command(): starting 12033 1726867217.31959: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/AnsiballZ_command.py && sleep 0' 12033 1726867217.32345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.32348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.32350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867217.32355: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867217.32357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.32405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867217.32412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.32459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.48176: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::17d/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::748a:d5ff:fe22:a4c0/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::748a:d5ff:fe22:a4c0/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:17.475798", "end": "2024-09-20 17:20:17.479365", "delta": "0:00:00.003567", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867217.49930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867217.49934: stdout chunk (state=3): >>><<< 12033 1726867217.49940: stderr chunk (state=3): >>><<< 12033 1726867217.49955: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::17d/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::748a:d5ff:fe22:a4c0/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::748a:d5ff:fe22:a4c0/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 17:20:17.475798", "end": "2024-09-20 17:20:17.479365", "delta": "0:00:00.003567", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867217.49991: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867217.49999: _low_level_execute_command(): starting 12033 1726867217.50004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867217.2392824-14730-124930254413800/ > /dev/null 2>&1 && sleep 0' 12033 1726867217.50444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867217.50448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.50451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867217.50453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867217.50455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867217.50500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867217.50503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867217.50558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867217.52384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867217.52408: stderr chunk (state=3): >>><<< 12033 1726867217.52411: stdout chunk (state=3): >>><<< 12033 1726867217.52423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867217.52429: handler run complete 12033 1726867217.52450: Evaluated conditional (False): False 12033 1726867217.52559: variable 'address' from source: include params 12033 1726867217.52566: variable 'result' from source: set_fact 12033 1726867217.52581: Evaluated conditional (address in result.stdout): True 12033 1726867217.52593: attempt loop complete, returning result 12033 1726867217.52596: _execute() done 12033 1726867217.52599: dumping result to json 12033 1726867217.52605: done dumping result, returning 12033 1726867217.52611: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0affcac9-a3a5-74bb-502b-000000000dc9] 12033 1726867217.52616: sending task result for task 0affcac9-a3a5-74bb-502b-000000000dc9 12033 1726867217.52714: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000dc9 12033 1726867217.52717: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003567", "end": "2024-09-20 17:20:17.479365", "rc": 0, "start": "2024-09-20 17:20:17.475798" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::17d/128 scope global dynamic noprefixroute valid_lft 238sec preferred_lft 238sec inet6 2001:db8::748a:d5ff:fe22:a4c0/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::748a:d5ff:fe22:a4c0/64 scope link noprefixroute valid_lft forever preferred_lft forever 12033 1726867217.52790: no more pending results, returning what we have 12033 1726867217.52794: results queue empty 12033 1726867217.52795: checking for 
any_errors_fatal 12033 1726867217.52796: done checking for any_errors_fatal 12033 1726867217.52796: checking for max_fail_percentage 12033 1726867217.52798: done checking for max_fail_percentage 12033 1726867217.52799: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.52800: done checking to see if all hosts have failed 12033 1726867217.52803: getting the remaining hosts for this loop 12033 1726867217.52805: done getting the remaining hosts for this loop 12033 1726867217.52808: getting the next task for host managed_node3 12033 1726867217.52817: done getting next task for host managed_node3 12033 1726867217.52820: ^ task is: TASK: Conditional asserts 12033 1726867217.52823: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867217.52828: getting variables 12033 1726867217.52829: in VariableManager get_vars() 12033 1726867217.52871: Calling all_inventory to load vars for managed_node3 12033 1726867217.52873: Calling groups_inventory to load vars for managed_node3 12033 1726867217.52876: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.52892: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.52895: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.52898: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.53718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.54570: done with get_vars() 12033 1726867217.54588: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 17:20:17 -0400 (0:00:00.364) 0:00:56.662 ****** 12033 1726867217.54656: entering _queue_task() for managed_node3/include_tasks 12033 1726867217.54869: worker is 1 (out of 1 available) 12033 1726867217.54884: exiting _queue_task() for managed_node3/include_tasks 12033 1726867217.54896: done queuing things up, now waiting for results queue to drain 12033 1726867217.54899: waiting for pending results... 
12033 1726867217.55076: running TaskExecutor() for managed_node3/TASK: Conditional asserts 12033 1726867217.55149: in run() - task 0affcac9-a3a5-74bb-502b-0000000008f0 12033 1726867217.55162: variable 'ansible_search_path' from source: unknown 12033 1726867217.55165: variable 'ansible_search_path' from source: unknown 12033 1726867217.55383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867217.57183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867217.57237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867217.57266: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867217.57294: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867217.57316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867217.57374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867217.57398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867217.57418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867217.57446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 12033 1726867217.57457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867217.57586: dumping result to json 12033 1726867217.57589: done dumping result, returning 12033 1726867217.57596: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [0affcac9-a3a5-74bb-502b-0000000008f0] 12033 1726867217.57605: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f0 12033 1726867217.57690: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f0 12033 1726867217.57692: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 12033 1726867217.57754: no more pending results, returning what we have 12033 1726867217.57758: results queue empty 12033 1726867217.57759: checking for any_errors_fatal 12033 1726867217.57765: done checking for any_errors_fatal 12033 1726867217.57766: checking for max_fail_percentage 12033 1726867217.57768: done checking for max_fail_percentage 12033 1726867217.57769: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.57770: done checking to see if all hosts have failed 12033 1726867217.57771: getting the remaining hosts for this loop 12033 1726867217.57772: done getting the remaining hosts for this loop 12033 1726867217.57776: getting the next task for host managed_node3 12033 1726867217.57784: done getting next task for host managed_node3 12033 1726867217.57786: ^ task is: TASK: Success in test '{{ lsr_description }}' 12033 1726867217.57789: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867217.57792: getting variables 12033 1726867217.57794: in VariableManager get_vars() 12033 1726867217.57933: Calling all_inventory to load vars for managed_node3 12033 1726867217.57936: Calling groups_inventory to load vars for managed_node3 12033 1726867217.57938: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.57947: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.57949: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.57952: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.59348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.60913: done with get_vars() 12033 1726867217.60936: done getting variables 12033 1726867217.60994: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867217.61116: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 
17:20:17 -0400 (0:00:00.064) 0:00:56.727 ****** 12033 1726867217.61146: entering _queue_task() for managed_node3/debug 12033 1726867217.61390: worker is 1 (out of 1 available) 12033 1726867217.61404: exiting _queue_task() for managed_node3/debug 12033 1726867217.61417: done queuing things up, now waiting for results queue to drain 12033 1726867217.61419: waiting for pending results... 12033 1726867217.61605: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 12033 1726867217.61671: in run() - task 0affcac9-a3a5-74bb-502b-0000000008f1 12033 1726867217.61686: variable 'ansible_search_path' from source: unknown 12033 1726867217.61689: variable 'ansible_search_path' from source: unknown 12033 1726867217.61719: calling self._execute() 12033 1726867217.61794: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.61797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.61807: variable 'omit' from source: magic vars 12033 1726867217.62065: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.62078: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.62082: variable 'omit' from source: magic vars 12033 1726867217.62109: variable 'omit' from source: magic vars 12033 1726867217.62176: variable 'lsr_description' from source: include params 12033 1726867217.62194: variable 'omit' from source: magic vars 12033 1726867217.62227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867217.62252: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867217.62269: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867217.62285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867217.62295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867217.62320: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867217.62323: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.62325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.62394: Set connection var ansible_pipelining to False 12033 1726867217.62404: Set connection var ansible_shell_executable to /bin/sh 12033 1726867217.62407: Set connection var ansible_timeout to 10 12033 1726867217.62413: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867217.62416: Set connection var ansible_connection to ssh 12033 1726867217.62420: Set connection var ansible_shell_type to sh 12033 1726867217.62436: variable 'ansible_shell_executable' from source: unknown 12033 1726867217.62439: variable 'ansible_connection' from source: unknown 12033 1726867217.62442: variable 'ansible_module_compression' from source: unknown 12033 1726867217.62447: variable 'ansible_shell_type' from source: unknown 12033 1726867217.62450: variable 'ansible_shell_executable' from source: unknown 12033 1726867217.62452: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.62457: variable 'ansible_pipelining' from source: unknown 12033 1726867217.62459: variable 'ansible_timeout' from source: unknown 12033 1726867217.62463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.62561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867217.62569: variable 'omit' from source: magic vars 12033 1726867217.62574: starting attempt loop 12033 1726867217.62578: running the handler 12033 1726867217.62619: handler run complete 12033 1726867217.62628: attempt loop complete, returning result 12033 1726867217.62631: _execute() done 12033 1726867217.62633: dumping result to json 12033 1726867217.62636: done dumping result, returning 12033 1726867217.62643: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [0affcac9-a3a5-74bb-502b-0000000008f1] 12033 1726867217.62648: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f1 12033 1726867217.62731: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f1 12033 1726867217.62734: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
+++++ 12033 1726867217.62782: no more pending results, returning what we have 12033 1726867217.62786: results queue empty 12033 1726867217.62787: checking for any_errors_fatal 12033 1726867217.62795: done checking for any_errors_fatal 12033 1726867217.62795: checking for max_fail_percentage 12033 1726867217.62798: done checking for max_fail_percentage 12033 1726867217.62798: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.62799: done checking to see if all hosts have failed 12033 1726867217.62800: getting the remaining hosts for this loop 12033 1726867217.62804: done getting the remaining hosts for this loop 12033 1726867217.62807: getting the next task for host managed_node3 12033 1726867217.62814: done getting next task for host managed_node3 12033 1726867217.62816: ^ task is: TASK: Cleanup 12033 1726867217.62819: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867217.62824: getting variables 12033 1726867217.62825: in VariableManager get_vars() 12033 1726867217.62867: Calling all_inventory to load vars for managed_node3 12033 1726867217.62870: Calling groups_inventory to load vars for managed_node3 12033 1726867217.62872: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.62882: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.62885: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.62888: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.63699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.65426: done with get_vars() 12033 1726867217.65446: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 17:20:17 -0400 (0:00:00.043) 0:00:56.771 ****** 12033 1726867217.65534: entering _queue_task() for managed_node3/include_tasks 12033 1726867217.65805: worker is 1 (out of 1 available) 12033 1726867217.65818: exiting _queue_task() for managed_node3/include_tasks 12033 1726867217.65829: done queuing things up, now waiting for results queue to drain 12033 1726867217.65831: waiting for pending results... 
12033 1726867217.66227: running TaskExecutor() for managed_node3/TASK: Cleanup 12033 1726867217.66253: in run() - task 0affcac9-a3a5-74bb-502b-0000000008f5 12033 1726867217.66274: variable 'ansible_search_path' from source: unknown 12033 1726867217.66285: variable 'ansible_search_path' from source: unknown 12033 1726867217.66350: variable 'lsr_cleanup' from source: include params 12033 1726867217.66569: variable 'lsr_cleanup' from source: include params 12033 1726867217.66679: variable 'omit' from source: magic vars 12033 1726867217.66806: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.66822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.66836: variable 'omit' from source: magic vars 12033 1726867217.67091: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.67124: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.67127: variable 'item' from source: unknown 12033 1726867217.67185: variable 'item' from source: unknown 12033 1726867217.67219: variable 'item' from source: unknown 12033 1726867217.67381: variable 'item' from source: unknown 12033 1726867217.67483: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.67487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.67489: variable 'omit' from source: magic vars 12033 1726867217.67820: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.67823: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.67825: variable 'item' from source: unknown 12033 1726867217.67827: variable 'item' from source: unknown 12033 1726867217.67830: variable 'item' from source: unknown 12033 1726867217.67831: variable 'item' from source: unknown 12033 1726867217.67929: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867217.67941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.67955: variable 'omit' from source: magic vars 12033 1726867217.68106: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.68117: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.68126: variable 'item' from source: unknown 12033 1726867217.68252: variable 'item' from source: unknown 12033 1726867217.68256: variable 'item' from source: unknown 12033 1726867217.68289: variable 'item' from source: unknown 12033 1726867217.68468: dumping result to json 12033 1726867217.68471: done dumping result, returning 12033 1726867217.68473: done running TaskExecutor() for managed_node3/TASK: Cleanup [0affcac9-a3a5-74bb-502b-0000000008f5] 12033 1726867217.68476: sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f5 12033 1726867217.68516: done sending task result for task 0affcac9-a3a5-74bb-502b-0000000008f5 12033 1726867217.68519: WORKER PROCESS EXITING 12033 1726867217.68601: no more pending results, returning what we have 12033 1726867217.68605: in VariableManager get_vars() 12033 1726867217.68646: Calling all_inventory to load vars for managed_node3 12033 1726867217.68649: Calling groups_inventory to load vars for managed_node3 12033 1726867217.68651: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.68662: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.68665: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.68667: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.69998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.71410: done with get_vars() 12033 1726867217.71427: variable 'ansible_search_path' from source: unknown 12033 1726867217.71429: variable 'ansible_search_path' from source: unknown 12033 
1726867217.71465: variable 'ansible_search_path' from source: unknown 12033 1726867217.71466: variable 'ansible_search_path' from source: unknown 12033 1726867217.71496: variable 'ansible_search_path' from source: unknown 12033 1726867217.71497: variable 'ansible_search_path' from source: unknown 12033 1726867217.71524: we have included files to process 12033 1726867217.71526: generating all_blocks data 12033 1726867217.71527: done generating all_blocks data 12033 1726867217.71531: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867217.71532: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867217.71535: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 12033 1726867217.71678: in VariableManager get_vars() 12033 1726867217.71704: done with get_vars() 12033 1726867217.71709: variable 'omit' from source: magic vars 12033 1726867217.71748: variable 'omit' from source: magic vars 12033 1726867217.71802: in VariableManager get_vars() 12033 1726867217.71818: done with get_vars() 12033 1726867217.71842: in VariableManager get_vars() 12033 1726867217.71861: done with get_vars() 12033 1726867217.71895: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12033 1726867217.72010: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12033 1726867217.72133: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12033 1726867217.72515: in VariableManager get_vars() 12033 1726867217.72540: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 
12033 1726867217.74444: done processing included file 12033 1726867217.74447: iterating over new_blocks loaded from include file 12033 1726867217.74448: in VariableManager get_vars() 12033 1726867217.74468: done with get_vars() 12033 1726867217.74469: filtering new block on tags 12033 1726867217.74769: done filtering new block on tags 12033 1726867217.74774: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 12033 1726867217.74781: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867217.74782: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867217.74785: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12033 1726867217.75122: done processing included file 12033 1726867217.75124: iterating over new_blocks loaded from include file 12033 1726867217.75125: in VariableManager get_vars() 12033 1726867217.75145: done with get_vars() 12033 1726867217.75146: filtering new block on tags 12033 1726867217.75247: done filtering new block on tags 12033 1726867217.75250: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 12033 1726867217.75253: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12033 1726867217.75259: loading 
included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12033 1726867217.75262: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12033 1726867217.75594: done processing included file 12033 1726867217.75596: iterating over new_blocks loaded from include file 12033 1726867217.75597: in VariableManager get_vars() 12033 1726867217.75618: done with get_vars() 12033 1726867217.75620: filtering new block on tags 12033 1726867217.75649: done filtering new block on tags 12033 1726867217.75652: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 => (item=tasks/check_network_dns.yml) 12033 1726867217.75655: extending task lists for all hosts with included blocks 12033 1726867217.83156: done extending task lists 12033 1726867217.83158: done processing included files 12033 1726867217.83158: results queue empty 12033 1726867217.83159: checking for any_errors_fatal 12033 1726867217.83162: done checking for any_errors_fatal 12033 1726867217.83163: checking for max_fail_percentage 12033 1726867217.83164: done checking for max_fail_percentage 12033 1726867217.83165: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.83166: done checking to see if all hosts have failed 12033 1726867217.83166: getting the remaining hosts for this loop 12033 1726867217.83168: done getting the remaining hosts for this loop 12033 1726867217.83170: getting the next task for host managed_node3 12033 1726867217.83174: done getting next task for host managed_node3 12033 1726867217.83179: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867217.83182: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867217.83194: getting variables 12033 1726867217.83195: in VariableManager get_vars() 12033 1726867217.83217: Calling all_inventory to load vars for managed_node3 12033 1726867217.83219: Calling groups_inventory to load vars for managed_node3 12033 1726867217.83221: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.83226: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.83228: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.83231: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.84345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.85898: done with get_vars() 12033 1726867217.85919: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:17 -0400 (0:00:00.204) 0:00:56.975 ****** 12033 1726867217.85998: entering _queue_task() for managed_node3/include_tasks 12033 1726867217.86350: worker is 1 (out of 1 available) 12033 1726867217.86360: exiting _queue_task() for managed_node3/include_tasks 12033 1726867217.86372: done queuing things up, now waiting for results queue to drain 12033 1726867217.86373: waiting for pending results... 
12033 1726867217.86702: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12033 1726867217.86865: in run() - task 0affcac9-a3a5-74bb-502b-000000000e0c 12033 1726867217.86892: variable 'ansible_search_path' from source: unknown 12033 1726867217.86900: variable 'ansible_search_path' from source: unknown 12033 1726867217.86949: calling self._execute() 12033 1726867217.87151: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867217.87156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867217.87158: variable 'omit' from source: magic vars 12033 1726867217.87498: variable 'ansible_distribution_major_version' from source: facts 12033 1726867217.87517: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867217.87529: _execute() done 12033 1726867217.87538: dumping result to json 12033 1726867217.87545: done dumping result, returning 12033 1726867217.87556: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-74bb-502b-000000000e0c] 12033 1726867217.87566: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0c 12033 1726867217.87737: no more pending results, returning what we have 12033 1726867217.87742: in VariableManager get_vars() 12033 1726867217.87798: Calling all_inventory to load vars for managed_node3 12033 1726867217.87801: Calling groups_inventory to load vars for managed_node3 12033 1726867217.87803: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.87815: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.87819: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.87821: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.88493: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0c 12033 
1726867217.88496: WORKER PROCESS EXITING 12033 1726867217.90099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867217.91699: done with get_vars() 12033 1726867217.91723: variable 'ansible_search_path' from source: unknown 12033 1726867217.91724: variable 'ansible_search_path' from source: unknown 12033 1726867217.91762: we have included files to process 12033 1726867217.91763: generating all_blocks data 12033 1726867217.91765: done generating all_blocks data 12033 1726867217.91766: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867217.91767: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867217.91770: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12033 1726867217.92348: done processing included file 12033 1726867217.92350: iterating over new_blocks loaded from include file 12033 1726867217.92351: in VariableManager get_vars() 12033 1726867217.92384: done with get_vars() 12033 1726867217.92385: filtering new block on tags 12033 1726867217.92420: done filtering new block on tags 12033 1726867217.92423: in VariableManager get_vars() 12033 1726867217.92451: done with get_vars() 12033 1726867217.92452: filtering new block on tags 12033 1726867217.92499: done filtering new block on tags 12033 1726867217.92504: in VariableManager get_vars() 12033 1726867217.92532: done with get_vars() 12033 1726867217.92534: filtering new block on tags 12033 1726867217.92580: done filtering new block on tags 12033 1726867217.92583: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 12033 1726867217.92588: extending task lists for all hosts 
with included blocks 12033 1726867217.94434: done extending task lists 12033 1726867217.94436: done processing included files 12033 1726867217.94437: results queue empty 12033 1726867217.94437: checking for any_errors_fatal 12033 1726867217.94441: done checking for any_errors_fatal 12033 1726867217.94441: checking for max_fail_percentage 12033 1726867217.94442: done checking for max_fail_percentage 12033 1726867217.94443: checking to see if all hosts have failed and the running result is not ok 12033 1726867217.94444: done checking to see if all hosts have failed 12033 1726867217.94444: getting the remaining hosts for this loop 12033 1726867217.94445: done getting the remaining hosts for this loop 12033 1726867217.94448: getting the next task for host managed_node3 12033 1726867217.94452: done getting next task for host managed_node3 12033 1726867217.94454: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867217.94458: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867217.94468: getting variables 12033 1726867217.94469: in VariableManager get_vars() 12033 1726867217.94487: Calling all_inventory to load vars for managed_node3 12033 1726867217.94490: Calling groups_inventory to load vars for managed_node3 12033 1726867217.94492: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867217.94497: Calling all_plugins_play to load vars for managed_node3 12033 1726867217.94500: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867217.94506: Calling groups_plugins_play to load vars for managed_node3 12033 1726867217.96851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867218.00079: done with get_vars() 12033 1726867218.00107: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:18 -0400 (0:00:00.143) 0:00:57.119 ****** 12033 1726867218.00390: entering _queue_task() for managed_node3/setup 12033 1726867218.00963: worker is 1 (out of 1 available) 12033 1726867218.00975: exiting _queue_task() for managed_node3/setup 12033 1726867218.01189: done queuing things up, now waiting for results queue to drain 12033 1726867218.01191: waiting for pending results... 
12033 1726867218.01524: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12033 1726867218.01914: in run() - task 0affcac9-a3a5-74bb-502b-000000000fe0 12033 1726867218.02106: variable 'ansible_search_path' from source: unknown 12033 1726867218.02109: variable 'ansible_search_path' from source: unknown 12033 1726867218.02113: calling self._execute() 12033 1726867218.02284: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.02329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867218.02344: variable 'omit' from source: magic vars 12033 1726867218.03208: variable 'ansible_distribution_major_version' from source: facts 12033 1726867218.03249: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867218.03771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867218.08383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867218.08463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867218.08508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867218.08782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867218.08785: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867218.08825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867218.08860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867218.09183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867218.09186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867218.09192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867218.09220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867218.09247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867218.09274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867218.09317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867218.09582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867218.09654: variable '__network_required_facts' from source: role 
'' defaults 12033 1726867218.09982: variable 'ansible_facts' from source: unknown 12033 1726867218.11212: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12033 1726867218.11221: when evaluation is False, skipping this task 12033 1726867218.11229: _execute() done 12033 1726867218.11236: dumping result to json 12033 1726867218.11244: done dumping result, returning 12033 1726867218.11259: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-74bb-502b-000000000fe0] 12033 1726867218.11269: sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe0 12033 1726867218.11390: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe0 12033 1726867218.11399: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867218.11526: no more pending results, returning what we have 12033 1726867218.11529: results queue empty 12033 1726867218.11530: checking for any_errors_fatal 12033 1726867218.11531: done checking for any_errors_fatal 12033 1726867218.11532: checking for max_fail_percentage 12033 1726867218.11534: done checking for max_fail_percentage 12033 1726867218.11535: checking to see if all hosts have failed and the running result is not ok 12033 1726867218.11535: done checking to see if all hosts have failed 12033 1726867218.11536: getting the remaining hosts for this loop 12033 1726867218.11538: done getting the remaining hosts for this loop 12033 1726867218.11542: getting the next task for host managed_node3 12033 1726867218.11552: done getting next task for host managed_node3 12033 1726867218.11555: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867218.11561: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867218.11583: getting variables 12033 1726867218.11584: in VariableManager get_vars() 12033 1726867218.11632: Calling all_inventory to load vars for managed_node3 12033 1726867218.11635: Calling groups_inventory to load vars for managed_node3 12033 1726867218.11637: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867218.11647: Calling all_plugins_play to load vars for managed_node3 12033 1726867218.11650: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867218.11659: Calling groups_plugins_play to load vars for managed_node3 12033 1726867218.14694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867218.17900: done with get_vars() 12033 1726867218.17937: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:18 -0400 (0:00:00.176) 0:00:57.296 ****** 12033 1726867218.18046: entering _queue_task() for managed_node3/stat 12033 1726867218.18599: worker is 1 (out of 1 available) 12033 1726867218.18612: exiting _queue_task() for managed_node3/stat 12033 1726867218.18622: done queuing things up, now waiting for results queue to drain 12033 1726867218.18623: waiting for pending results... 
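The skip logged above for "Ensure ansible_facts used by role are present" comes from the Jinja2 guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False. As a minimal Python sketch of that check (the fact names and values below are illustrative stand-ins, not data from this run):

```python
# Sketch of the fact-gathering guard from the network role's set_facts.yml.
# required_facts and ansible_facts here are hypothetical examples;
# the real lists come from role defaults and gathered host facts.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja2's difference filter: required facts not already present.
missing = [f for f in required_facts if f not in ansible_facts.keys()]

# The setup task only runs when something is missing; with an empty
# difference the conditional is False and the task is skipped, matching
# "Evaluated conditional (... | length > 0): False" in the log.
run_setup = len(missing) > 0
print(run_setup)
```

In this run all required facts were already gathered earlier in the play, so the difference is empty and the task is skipped with `no_log: true` censoring its result.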
12033 1726867218.18759: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 12033 1726867218.18940: in run() - task 0affcac9-a3a5-74bb-502b-000000000fe2 12033 1726867218.18968: variable 'ansible_search_path' from source: unknown 12033 1726867218.18979: variable 'ansible_search_path' from source: unknown 12033 1726867218.19023: calling self._execute() 12033 1726867218.19129: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.19142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867218.19156: variable 'omit' from source: magic vars 12033 1726867218.19560: variable 'ansible_distribution_major_version' from source: facts 12033 1726867218.19580: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867218.19775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867218.20068: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867218.20123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867218.20165: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867218.20209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867218.20334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867218.20360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867218.20396: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867218.20425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867218.20529: variable '__network_is_ostree' from source: set_fact 12033 1726867218.20541: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867218.20549: when evaluation is False, skipping this task 12033 1726867218.20556: _execute() done 12033 1726867218.20564: dumping result to json 12033 1726867218.20571: done dumping result, returning 12033 1726867218.20595: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-74bb-502b-000000000fe2] 12033 1726867218.20682: sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe2 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867218.20809: no more pending results, returning what we have 12033 1726867218.20813: results queue empty 12033 1726867218.20814: checking for any_errors_fatal 12033 1726867218.20827: done checking for any_errors_fatal 12033 1726867218.20828: checking for max_fail_percentage 12033 1726867218.20830: done checking for max_fail_percentage 12033 1726867218.20831: checking to see if all hosts have failed and the running result is not ok 12033 1726867218.20831: done checking to see if all hosts have failed 12033 1726867218.20832: getting the remaining hosts for this loop 12033 1726867218.20835: done getting the remaining hosts for this loop 12033 1726867218.20839: getting the next task for host managed_node3 12033 1726867218.20848: done getting next task for host managed_node3 12033 
1726867218.20851: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867218.20858: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867218.20887: getting variables 12033 1726867218.20889: in VariableManager get_vars() 12033 1726867218.20938: Calling all_inventory to load vars for managed_node3 12033 1726867218.20941: Calling groups_inventory to load vars for managed_node3 12033 1726867218.20943: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867218.20954: Calling all_plugins_play to load vars for managed_node3 12033 1726867218.20957: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867218.20960: Calling groups_plugins_play to load vars for managed_node3 12033 1726867218.21484: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe2 12033 1726867218.21492: WORKER PROCESS EXITING 12033 1726867218.23290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867218.25151: done with get_vars() 12033 1726867218.25174: done getting variables 12033 1726867218.25247: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:18 -0400 (0:00:00.072) 0:00:57.369 ****** 12033 1726867218.25331: entering _queue_task() for managed_node3/set_fact 12033 1726867218.25700: worker is 1 (out of 1 available) 12033 1726867218.25716: exiting _queue_task() for managed_node3/set_fact 12033 1726867218.25728: done queuing things up, now waiting for results queue to drain 12033 1726867218.25730: waiting for pending results... 
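The skip above comes from the task's `when: not __network_is_ostree is defined` conditional: an earlier `set_fact` already defined `__network_is_ostree`, so the expression evaluates to False and the task never runs. Ansible evaluates this through Jinja2's templar; as a minimal sketch (not the real implementation), the `is defined` test reduces to a membership check against the task's variables:

```python
# Hypothetical reduction of `when: not __network_is_ostree is defined`.
# Ansible runs this through Jinja2; here `is defined` becomes a plain
# membership check on the task's variable namespace.
def conditional_is_true(task_vars: dict) -> bool:
    return "__network_is_ostree" not in task_vars

# An earlier set_fact stored the flag, so the conditional is False and
# the task is skipped -- matching the log's "Evaluated conditional
# (not __network_is_ostree is defined): False".
print(conditional_is_true({"__network_is_ostree": False}))  # False -> skip
print(conditional_is_true({}))                              # True  -> run
```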
12033 1726867218.26062: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12033 1726867218.26257: in run() - task 0affcac9-a3a5-74bb-502b-000000000fe3 12033 1726867218.26284: variable 'ansible_search_path' from source: unknown 12033 1726867218.26383: variable 'ansible_search_path' from source: unknown 12033 1726867218.26387: calling self._execute() 12033 1726867218.26443: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.26455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867218.26469: variable 'omit' from source: magic vars 12033 1726867218.26884: variable 'ansible_distribution_major_version' from source: facts 12033 1726867218.26905: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867218.27158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867218.27407: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867218.27457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867218.27505: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867218.27546: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867218.27684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867218.27725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867218.27757: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867218.27792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867218.27897: variable '__network_is_ostree' from source: set_fact 12033 1726867218.27981: Evaluated conditional (not __network_is_ostree is defined): False 12033 1726867218.27986: when evaluation is False, skipping this task 12033 1726867218.27989: _execute() done 12033 1726867218.27991: dumping result to json 12033 1726867218.27993: done dumping result, returning 12033 1726867218.27996: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-74bb-502b-000000000fe3] 12033 1726867218.27999: sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe3 12033 1726867218.28072: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe3 12033 1726867218.28076: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12033 1726867218.28132: no more pending results, returning what we have 12033 1726867218.28137: results queue empty 12033 1726867218.28138: checking for any_errors_fatal 12033 1726867218.28145: done checking for any_errors_fatal 12033 1726867218.28146: checking for max_fail_percentage 12033 1726867218.28148: done checking for max_fail_percentage 12033 1726867218.28149: checking to see if all hosts have failed and the running result is not ok 12033 1726867218.28150: done checking to see if all hosts have failed 12033 1726867218.28151: getting the remaining hosts for this loop 12033 1726867218.28153: done getting the remaining hosts for this loop 
12033 1726867218.28157: getting the next task for host managed_node3 12033 1726867218.28168: done getting next task for host managed_node3 12033 1726867218.28172: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867218.28181: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867218.28207: getting variables 12033 1726867218.28209: in VariableManager get_vars() 12033 1726867218.28259: Calling all_inventory to load vars for managed_node3 12033 1726867218.28261: Calling groups_inventory to load vars for managed_node3 12033 1726867218.28264: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867218.28273: Calling all_plugins_play to load vars for managed_node3 12033 1726867218.28276: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867218.28486: Calling groups_plugins_play to load vars for managed_node3 12033 1726867218.29930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867218.32417: done with get_vars() 12033 1726867218.32439: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:18 -0400 (0:00:00.072) 0:00:57.441 ****** 12033 1726867218.32541: entering _queue_task() for managed_node3/service_facts 12033 1726867218.32849: worker is 1 (out of 1 available) 12033 1726867218.32861: exiting _queue_task() for managed_node3/service_facts 12033 1726867218.32874: done queuing things up, now waiting for results queue to drain 12033 1726867218.32876: waiting for pending results... 
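The `service_facts` task being queued here returns a `services` dict keyed by unit name, each entry carrying `name`, `state`, `status`, and `source` (visible in the module's JSON output further down in this log). A hedged sketch of shaping `systemctl`-style unit listings into that structure; the sample input is invented for illustration, and the real module shells out to systemd/sysvinit tooling rather than parsing a fixed string:

```python
def parse_systemd_units(text: str) -> dict:
    # Map "unit state status" lines into the service_facts result shape:
    # {"<unit>": {"name", "state", "status", "source"}}
    services = {}
    for line in text.strip().splitlines():
        unit, state, status = line.split()
        services[unit] = {
            "name": unit,
            "state": state,    # e.g. "running" / "stopped"
            "status": status,  # e.g. "enabled" / "static"
            "source": "systemd",
        }
    return services

# Invented sample mirroring entries seen in the module output below.
sample = """\
auditd.service running enabled
fstrim.service stopped static
"""
facts = {"ansible_facts": {"services": parse_systemd_units(sample)}}
print(facts["ansible_facts"]["services"]["auditd.service"]["state"])  # running
```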
12033 1726867218.33349: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 12033 1726867218.33354: in run() - task 0affcac9-a3a5-74bb-502b-000000000fe5 12033 1726867218.33374: variable 'ansible_search_path' from source: unknown 12033 1726867218.33384: variable 'ansible_search_path' from source: unknown 12033 1726867218.33446: calling self._execute() 12033 1726867218.33525: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.33536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867218.33556: variable 'omit' from source: magic vars 12033 1726867218.33987: variable 'ansible_distribution_major_version' from source: facts 12033 1726867218.33994: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867218.33997: variable 'omit' from source: magic vars 12033 1726867218.34078: variable 'omit' from source: magic vars 12033 1726867218.34127: variable 'omit' from source: magic vars 12033 1726867218.34203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867218.34224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867218.34249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867218.34271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867218.34290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867218.34419: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867218.34422: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.34429: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867218.34460: Set connection var ansible_pipelining to False 12033 1726867218.34480: Set connection var ansible_shell_executable to /bin/sh 12033 1726867218.34497: Set connection var ansible_timeout to 10 12033 1726867218.34509: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867218.34515: Set connection var ansible_connection to ssh 12033 1726867218.34528: Set connection var ansible_shell_type to sh 12033 1726867218.34555: variable 'ansible_shell_executable' from source: unknown 12033 1726867218.34562: variable 'ansible_connection' from source: unknown 12033 1726867218.34570: variable 'ansible_module_compression' from source: unknown 12033 1726867218.34579: variable 'ansible_shell_type' from source: unknown 12033 1726867218.34587: variable 'ansible_shell_executable' from source: unknown 12033 1726867218.34594: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867218.34604: variable 'ansible_pipelining' from source: unknown 12033 1726867218.34611: variable 'ansible_timeout' from source: unknown 12033 1726867218.34619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867218.34829: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867218.34962: variable 'omit' from source: magic vars 12033 1726867218.34967: starting attempt loop 12033 1726867218.34969: running the handler 12033 1726867218.34971: _low_level_execute_command(): starting 12033 1726867218.34972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867218.36170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867218.36222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867218.36243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867218.36321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867218.36336: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867218.36355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867218.36382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867218.36398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867218.36489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867218.38425: stdout chunk (state=3): >>>/root <<< 12033 1726867218.38428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867218.38430: stdout chunk (state=3): >>><<< 12033 1726867218.38432: stderr chunk (state=3): >>><<< 12033 1726867218.38435: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
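The first `_low_level_execute_command()` above is a cheap probe: running `/bin/sh -c 'echo ~ && sleep 0'` makes the remote shell expand `~`, so Ansible learns the remote user's home directory (here `/root`) before creating its temp directory under it. Executed locally for illustration, the same probe looks like:

```python
import subprocess

# The same probe Ansible sends over SSH, run locally: the shell expands
# `~` to the home directory, and `sleep 0` ends the command cleanly.
proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = proc.stdout.strip()
print(proc.returncode, home)  # rc=0 and the home directory path
```

Over the wire, the stdout (`/root`) comes back interleaved with the OpenSSH debug chatter on stderr, which is why the log splits the output into `stdout chunk` / `stderr chunk` pieces.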
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867218.38438: _low_level_execute_command(): starting 12033 1726867218.38448: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896 `" && echo ansible-tmp-1726867218.3839896-14767-185715235213896="` echo /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896 `" ) && sleep 0' 12033 1726867218.39410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867218.39418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867218.39425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867218.39441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867218.39452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867218.39500: stderr chunk 
(state=3): >>>debug2: match not found <<< 12033 1726867218.39516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867218.39551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867218.39591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867218.39631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867218.41822: stdout chunk (state=3): >>>ansible-tmp-1726867218.3839896-14767-185715235213896=/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896 <<< 12033 1726867218.42301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867218.42306: stdout chunk (state=3): >>><<< 12033 1726867218.42310: stderr chunk (state=3): >>><<< 12033 1726867218.42313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867218.3839896-14767-185715235213896=/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867218.42316: variable 'ansible_module_compression' from source: unknown 12033 1726867218.42319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12033 1726867218.42322: variable 'ansible_facts' from source: unknown 12033 1726867218.42473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py 12033 1726867218.42863: Sending initial data 12033 1726867218.42965: Sent initial data (162 bytes) 12033 1726867218.44315: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867218.44393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867218.44474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867218.46018: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867218.46060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867218.46129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpg7zsocx1 /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py <<< 12033 1726867218.46132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py" <<< 12033 1726867218.46200: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpg7zsocx1" to remote "/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py" <<< 12033 1726867218.47102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867218.47117: stderr chunk (state=3): >>><<< 12033 1726867218.47223: stdout chunk (state=3): >>><<< 12033 1726867218.47226: done transferring module to remote 12033 1726867218.47229: _low_level_execute_command(): starting 12033 1726867218.47231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/ /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py && sleep 0' 12033 1726867218.47851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867218.47865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867218.47918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867218.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867218.48038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867218.48052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867218.48090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867218.48135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867218.49941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867218.49964: stdout chunk (state=3): >>><<< 12033 1726867218.49968: stderr chunk (state=3): >>><<< 12033 1726867218.50059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867218.50062: _low_level_execute_command(): starting 12033 1726867218.50065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/AnsiballZ_service_facts.py && sleep 0' 12033 1726867218.50661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867218.50673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867218.50694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867218.50764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867218.50814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867218.50829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867218.50876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867218.50928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.04346: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": 
"dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 12033 1726867220.04353: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 12033 1726867220.04372: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 12033 1726867220.04392: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 12033 1726867220.04408: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 12033 1726867220.04430: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": 
"modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 12033 1726867220.04435: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state<<< 12033 1726867220.04449: stdout chunk (state=3): >>>": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12033 1726867220.05997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867220.06000: stdout chunk (state=3): >>><<< 12033 1726867220.06002: stderr chunk (state=3): >>><<< 12033 1726867220.06049: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": 
"getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": 
"systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", 
"source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": 
"indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867220.06761: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867220.06768: _low_level_execute_command(): starting 12033 1726867220.06773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867218.3839896-14767-185715235213896/ > /dev/null 2>&1 && sleep 0' 12033 1726867220.07187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.07190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.07193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.07195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.07248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867220.07251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.07300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.09115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.09134: stderr chunk (state=3): >>><<< 12033 1726867220.09137: stdout chunk (state=3): >>><<< 12033 1726867220.09154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867220.09158: handler run 
complete 12033 1726867220.09264: variable 'ansible_facts' from source: unknown 12033 1726867220.09347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.09621: variable 'ansible_facts' from source: unknown 12033 1726867220.09701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.09816: attempt loop complete, returning result 12033 1726867220.09820: _execute() done 12033 1726867220.09824: dumping result to json 12033 1726867220.09859: done dumping result, returning 12033 1726867220.09866: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-74bb-502b-000000000fe5] 12033 1726867220.09870: sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe5 12033 1726867220.10595: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe5 12033 1726867220.10602: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867220.10651: no more pending results, returning what we have 12033 1726867220.10653: results queue empty 12033 1726867220.10654: checking for any_errors_fatal 12033 1726867220.10656: done checking for any_errors_fatal 12033 1726867220.10657: checking for max_fail_percentage 12033 1726867220.10658: done checking for max_fail_percentage 12033 1726867220.10659: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.10659: done checking to see if all hosts have failed 12033 1726867220.10659: getting the remaining hosts for this loop 12033 1726867220.10660: done getting the remaining hosts for this loop 12033 1726867220.10663: getting the next task for host managed_node3 12033 1726867220.10667: done getting next task for host managed_node3 12033 
1726867220.10669: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867220.10674: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867220.10684: getting variables 12033 1726867220.10685: in VariableManager get_vars() 12033 1726867220.10713: Calling all_inventory to load vars for managed_node3 12033 1726867220.10715: Calling groups_inventory to load vars for managed_node3 12033 1726867220.10717: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867220.10723: Calling all_plugins_play to load vars for managed_node3 12033 1726867220.10725: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867220.10727: Calling groups_plugins_play to load vars for managed_node3 12033 1726867220.11403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.12269: done with get_vars() 12033 1726867220.12286: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:20 -0400 (0:00:01.798) 0:00:59.239 ****** 12033 1726867220.12358: entering _queue_task() for managed_node3/package_facts 12033 1726867220.12571: worker is 1 (out of 1 available) 12033 1726867220.12589: exiting _queue_task() for managed_node3/package_facts 12033 1726867220.12604: done queuing things up, now waiting for results queue to drain 12033 1726867220.12605: waiting for pending results... 
12033 1726867220.12780: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 12033 1726867220.12886: in run() - task 0affcac9-a3a5-74bb-502b-000000000fe6 12033 1726867220.12898: variable 'ansible_search_path' from source: unknown 12033 1726867220.12904: variable 'ansible_search_path' from source: unknown 12033 1726867220.12929: calling self._execute() 12033 1726867220.13007: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.13011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.13018: variable 'omit' from source: magic vars 12033 1726867220.13298: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.13308: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867220.13320: variable 'omit' from source: magic vars 12033 1726867220.13382: variable 'omit' from source: magic vars 12033 1726867220.13404: variable 'omit' from source: magic vars 12033 1726867220.13433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867220.13460: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867220.13474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867220.13492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867220.13504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867220.13525: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867220.13528: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.13531: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 12033 1726867220.13599: Set connection var ansible_pipelining to False 12033 1726867220.13607: Set connection var ansible_shell_executable to /bin/sh 12033 1726867220.13614: Set connection var ansible_timeout to 10 12033 1726867220.13619: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867220.13621: Set connection var ansible_connection to ssh 12033 1726867220.13626: Set connection var ansible_shell_type to sh 12033 1726867220.13642: variable 'ansible_shell_executable' from source: unknown 12033 1726867220.13645: variable 'ansible_connection' from source: unknown 12033 1726867220.13649: variable 'ansible_module_compression' from source: unknown 12033 1726867220.13651: variable 'ansible_shell_type' from source: unknown 12033 1726867220.13654: variable 'ansible_shell_executable' from source: unknown 12033 1726867220.13656: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.13658: variable 'ansible_pipelining' from source: unknown 12033 1726867220.13661: variable 'ansible_timeout' from source: unknown 12033 1726867220.13665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.13812: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867220.13817: variable 'omit' from source: magic vars 12033 1726867220.13819: starting attempt loop 12033 1726867220.13822: running the handler 12033 1726867220.13837: _low_level_execute_command(): starting 12033 1726867220.13843: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867220.14338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12033 1726867220.14341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.14344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.14346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.14400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.14404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867220.14406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.14457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.16063: stdout chunk (state=3): >>>/root <<< 12033 1726867220.16161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.16187: stderr chunk (state=3): >>><<< 12033 1726867220.16190: stdout chunk (state=3): >>><<< 12033 1726867220.16212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867220.16223: _low_level_execute_command(): starting 12033 1726867220.16228: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783 `" && echo ansible-tmp-1726867220.1621077-14835-161886958283783="` echo /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783 `" ) && sleep 0' 12033 1726867220.16637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.16640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.16648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.16650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.16695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.16698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.16747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.18622: stdout chunk (state=3): >>>ansible-tmp-1726867220.1621077-14835-161886958283783=/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783 <<< 12033 1726867220.18731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.18755: stderr chunk (state=3): >>><<< 12033 1726867220.18758: stdout chunk (state=3): >>><<< 12033 1726867220.18772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867220.1621077-14835-161886958283783=/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867220.18812: variable 'ansible_module_compression' from source: unknown 12033 1726867220.18847: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12033 1726867220.18895: variable 'ansible_facts' from source: unknown 12033 1726867220.19019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py 12033 1726867220.19111: Sending initial data 12033 1726867220.19114: Sent initial data (162 bytes) 12033 1726867220.19543: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.19546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867220.19549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.19553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.19555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.19607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.19614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.19659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.21213: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12033 1726867220.21217: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867220.21256: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867220.21302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk_gb8pjr /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py <<< 12033 1726867220.21310: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py" <<< 12033 1726867220.21345: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpk_gb8pjr" to remote "/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py" <<< 12033 1726867220.22398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.22432: stderr chunk (state=3): >>><<< 12033 1726867220.22435: stdout chunk (state=3): >>><<< 12033 1726867220.22459: done transferring module to remote 12033 1726867220.22469: _low_level_execute_command(): starting 12033 1726867220.22479: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/ /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py && sleep 0' 12033 1726867220.22870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.22874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.22886: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.22945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.22948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.22997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.24763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.24785: stderr chunk (state=3): >>><<< 12033 1726867220.24789: stdout chunk (state=3): >>><<< 12033 1726867220.24800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 
10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867220.24803: _low_level_execute_command(): starting 12033 1726867220.24810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/AnsiballZ_package_facts.py && sleep 0' 12033 1726867220.25216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867220.25219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867220.25222: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.25224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.25226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.25270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.25273: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.25325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.70092: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 12033 1726867220.70106: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 12033 1726867220.70125: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 12033 1726867220.70161: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 12033 1726867220.70166: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 12033 1726867220.70179: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 12033 1726867220.70213: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": 
[{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 12033 1726867220.70253: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": 
[{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": 
[{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 12033 1726867220.70257: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 12033 1726867220.70279: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", 
"release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 12033 1726867220.70291: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": 
"23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 12033 1726867220.70310: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12033 1726867220.72031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867220.72055: stderr chunk (state=3): >>><<< 12033 1726867220.72058: stdout chunk (state=3): >>><<< 12033 1726867220.72095: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867220.73403: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867220.73417: _low_level_execute_command(): starting 12033 1726867220.73421: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867220.1621077-14835-161886958283783/ > /dev/null 2>&1 && sleep 0' 12033 1726867220.73848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.73851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867220.73854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.73856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867220.73858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867220.73900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867220.73914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867220.73964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867220.75803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867220.75825: stderr chunk (state=3): >>><<< 12033 1726867220.75829: stdout chunk (state=3): >>><<< 12033 1726867220.75841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12033 1726867220.75847: handler run complete 12033 1726867220.76289: variable 'ansible_facts' from source: unknown 12033 1726867220.76604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.77631: variable 'ansible_facts' from source: unknown 12033 1726867220.77867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.78249: attempt loop complete, returning result 12033 1726867220.78258: _execute() done 12033 1726867220.78261: dumping result to json 12033 1726867220.78374: done dumping result, returning 12033 1726867220.78383: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-74bb-502b-000000000fe6] 12033 1726867220.78388: sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe6 12033 1726867220.79647: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000fe6 12033 1726867220.79650: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867220.79737: no more pending results, returning what we have 12033 1726867220.79740: results queue empty 12033 1726867220.79740: checking for any_errors_fatal 12033 1726867220.79744: done checking for any_errors_fatal 12033 1726867220.79744: checking for max_fail_percentage 12033 1726867220.79745: done checking for max_fail_percentage 12033 1726867220.79746: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.79746: done checking to see if all hosts have failed 12033 1726867220.79747: getting the remaining hosts for this loop 12033 1726867220.79748: done getting the remaining hosts for this loop 12033 1726867220.79750: getting the next task for host managed_node3 12033 1726867220.79755: done 
getting next task for host managed_node3 12033 1726867220.79757: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867220.79761: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867220.79769: getting variables 12033 1726867220.79770: in VariableManager get_vars() 12033 1726867220.79797: Calling all_inventory to load vars for managed_node3 12033 1726867220.79799: Calling groups_inventory to load vars for managed_node3 12033 1726867220.79800: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867220.79811: Calling all_plugins_play to load vars for managed_node3 12033 1726867220.79814: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867220.79816: Calling groups_plugins_play to load vars for managed_node3 12033 1726867220.80479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.81350: done with get_vars() 12033 1726867220.81367: done getting variables 12033 1726867220.81414: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:20 -0400 (0:00:00.690) 0:00:59.930 ****** 12033 1726867220.81438: entering _queue_task() for managed_node3/debug 12033 1726867220.81655: worker is 1 (out of 1 available) 12033 1726867220.81670: exiting _queue_task() for managed_node3/debug 12033 1726867220.81685: done queuing things up, now waiting for results queue to drain 12033 1726867220.81687: waiting for pending results... 
12033 1726867220.81874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 12033 1726867220.81967: in run() - task 0affcac9-a3a5-74bb-502b-000000000e0d 12033 1726867220.81981: variable 'ansible_search_path' from source: unknown 12033 1726867220.81984: variable 'ansible_search_path' from source: unknown 12033 1726867220.82013: calling self._execute() 12033 1726867220.82086: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.82090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.82098: variable 'omit' from source: magic vars 12033 1726867220.82388: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.82398: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867220.82403: variable 'omit' from source: magic vars 12033 1726867220.82456: variable 'omit' from source: magic vars 12033 1726867220.82529: variable 'network_provider' from source: set_fact 12033 1726867220.82542: variable 'omit' from source: magic vars 12033 1726867220.82574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867220.82602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867220.82620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867220.82632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867220.82642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867220.82666: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867220.82669: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 
1726867220.82672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.82745: Set connection var ansible_pipelining to False 12033 1726867220.82752: Set connection var ansible_shell_executable to /bin/sh 12033 1726867220.82758: Set connection var ansible_timeout to 10 12033 1726867220.82764: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867220.82766: Set connection var ansible_connection to ssh 12033 1726867220.82771: Set connection var ansible_shell_type to sh 12033 1726867220.82791: variable 'ansible_shell_executable' from source: unknown 12033 1726867220.82796: variable 'ansible_connection' from source: unknown 12033 1726867220.82799: variable 'ansible_module_compression' from source: unknown 12033 1726867220.82801: variable 'ansible_shell_type' from source: unknown 12033 1726867220.82803: variable 'ansible_shell_executable' from source: unknown 12033 1726867220.82805: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.82807: variable 'ansible_pipelining' from source: unknown 12033 1726867220.82809: variable 'ansible_timeout' from source: unknown 12033 1726867220.82814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.82915: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867220.82926: variable 'omit' from source: magic vars 12033 1726867220.82930: starting attempt loop 12033 1726867220.82933: running the handler 12033 1726867220.82969: handler run complete 12033 1726867220.82985: attempt loop complete, returning result 12033 1726867220.82988: _execute() done 12033 1726867220.82991: dumping result to json 12033 1726867220.82993: done dumping result, returning 
12033 1726867220.82995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-74bb-502b-000000000e0d] 12033 1726867220.83003: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0d 12033 1726867220.83087: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0d 12033 1726867220.83090: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 12033 1726867220.83171: no more pending results, returning what we have 12033 1726867220.83174: results queue empty 12033 1726867220.83175: checking for any_errors_fatal 12033 1726867220.83184: done checking for any_errors_fatal 12033 1726867220.83184: checking for max_fail_percentage 12033 1726867220.83186: done checking for max_fail_percentage 12033 1726867220.83187: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.83187: done checking to see if all hosts have failed 12033 1726867220.83188: getting the remaining hosts for this loop 12033 1726867220.83189: done getting the remaining hosts for this loop 12033 1726867220.83192: getting the next task for host managed_node3 12033 1726867220.83199: done getting next task for host managed_node3 12033 1726867220.83203: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867220.83207: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867220.83218: getting variables 12033 1726867220.83220: in VariableManager get_vars() 12033 1726867220.83257: Calling all_inventory to load vars for managed_node3 12033 1726867220.83259: Calling groups_inventory to load vars for managed_node3 12033 1726867220.83261: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867220.83269: Calling all_plugins_play to load vars for managed_node3 12033 1726867220.83271: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867220.83274: Calling groups_plugins_play to load vars for managed_node3 12033 1726867220.84092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.84987: done with get_vars() 12033 1726867220.85004: done getting variables 12033 1726867220.85046: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the 
initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:20 -0400 (0:00:00.036) 0:00:59.966 ****** 12033 1726867220.85073: entering _queue_task() for managed_node3/fail 12033 1726867220.85280: worker is 1 (out of 1 available) 12033 1726867220.85294: exiting _queue_task() for managed_node3/fail 12033 1726867220.85307: done queuing things up, now waiting for results queue to drain 12033 1726867220.85308: waiting for pending results... 12033 1726867220.85486: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12033 1726867220.85578: in run() - task 0affcac9-a3a5-74bb-502b-000000000e0e 12033 1726867220.85589: variable 'ansible_search_path' from source: unknown 12033 1726867220.85593: variable 'ansible_search_path' from source: unknown 12033 1726867220.85621: calling self._execute() 12033 1726867220.85721: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.85724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.85728: variable 'omit' from source: magic vars 12033 1726867220.86113: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.86117: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867220.86208: variable 'network_state' from source: role '' defaults 12033 1726867220.86211: Evaluated conditional (network_state != {}): False 12033 1726867220.86214: when evaluation is False, skipping this task 12033 1726867220.86217: _execute() done 12033 1726867220.86219: dumping result to json 12033 1726867220.86221: done dumping result, returning 12033 1726867220.86228: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-74bb-502b-000000000e0e] 12033 1726867220.86233: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0e 12033 1726867220.86328: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0e 12033 1726867220.86330: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867220.86376: no more pending results, returning what we have 12033 1726867220.86382: results queue empty 12033 1726867220.86383: checking for any_errors_fatal 12033 1726867220.86389: done checking for any_errors_fatal 12033 1726867220.86390: checking for max_fail_percentage 12033 1726867220.86392: done checking for max_fail_percentage 12033 1726867220.86393: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.86393: done checking to see if all hosts have failed 12033 1726867220.86394: getting the remaining hosts for this loop 12033 1726867220.86396: done getting the remaining hosts for this loop 12033 1726867220.86399: getting the next task for host managed_node3 12033 1726867220.86409: done getting next task for host managed_node3 12033 1726867220.86412: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867220.86416: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867220.86432: getting variables 12033 1726867220.86433: in VariableManager get_vars() 12033 1726867220.86469: Calling all_inventory to load vars for managed_node3 12033 1726867220.86472: Calling groups_inventory to load vars for managed_node3 12033 1726867220.86474: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867220.86483: Calling all_plugins_play to load vars for managed_node3 12033 1726867220.86486: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867220.86488: Calling groups_plugins_play to load vars for managed_node3 12033 1726867220.87473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.88334: done with get_vars() 12033 1726867220.88351: done getting variables 12033 1726867220.88391: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:20 -0400 (0:00:00.033) 0:00:59.999 ****** 12033 1726867220.88415: entering _queue_task() for managed_node3/fail 12033 1726867220.88604: worker is 1 (out of 1 available) 12033 1726867220.88618: exiting _queue_task() for managed_node3/fail 12033 1726867220.88630: done queuing things up, now waiting for results queue to drain 12033 1726867220.88631: waiting for pending results... 12033 1726867220.88815: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12033 1726867220.89186: in run() - task 0affcac9-a3a5-74bb-502b-000000000e0f 12033 1726867220.89190: variable 'ansible_search_path' from source: unknown 12033 1726867220.89193: variable 'ansible_search_path' from source: unknown 12033 1726867220.89196: calling self._execute() 12033 1726867220.89199: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.89202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.89204: variable 'omit' from source: magic vars 12033 1726867220.89459: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.89482: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867220.89787: variable 'network_state' from source: role '' defaults 12033 1726867220.89790: Evaluated conditional (network_state != {}): False 12033 1726867220.89793: when evaluation is False, skipping this task 12033 1726867220.89795: _execute() done 12033 1726867220.89797: dumping result to json 12033 1726867220.89799: done dumping result, returning 12033 1726867220.89801: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the 
managed host is below 8 [0affcac9-a3a5-74bb-502b-000000000e0f] 12033 1726867220.89803: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0f 12033 1726867220.89861: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e0f 12033 1726867220.89863: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867220.89921: no more pending results, returning what we have 12033 1726867220.89925: results queue empty 12033 1726867220.89925: checking for any_errors_fatal 12033 1726867220.89932: done checking for any_errors_fatal 12033 1726867220.89933: checking for max_fail_percentage 12033 1726867220.89935: done checking for max_fail_percentage 12033 1726867220.89936: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.89936: done checking to see if all hosts have failed 12033 1726867220.89937: getting the remaining hosts for this loop 12033 1726867220.89938: done getting the remaining hosts for this loop 12033 1726867220.89941: getting the next task for host managed_node3 12033 1726867220.89948: done getting next task for host managed_node3 12033 1726867220.89951: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867220.89956: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867220.89973: getting variables 12033 1726867220.89974: in VariableManager get_vars() 12033 1726867220.90014: Calling all_inventory to load vars for managed_node3 12033 1726867220.90016: Calling groups_inventory to load vars for managed_node3 12033 1726867220.90019: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867220.90027: Calling all_plugins_play to load vars for managed_node3 12033 1726867220.90029: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867220.90032: Calling groups_plugins_play to load vars for managed_node3 12033 1726867220.91611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867220.93235: done with get_vars() 12033 1726867220.93256: done getting variables 12033 1726867220.93315: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** 
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:20 -0400 (0:00:00.049) 0:01:00.049 ****** 12033 1726867220.93354: entering _queue_task() for managed_node3/fail 12033 1726867220.93630: worker is 1 (out of 1 available) 12033 1726867220.93643: exiting _queue_task() for managed_node3/fail 12033 1726867220.93767: done queuing things up, now waiting for results queue to drain 12033 1726867220.93769: waiting for pending results... 12033 1726867220.93951: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12033 1726867220.94120: in run() - task 0affcac9-a3a5-74bb-502b-000000000e10 12033 1726867220.94139: variable 'ansible_search_path' from source: unknown 12033 1726867220.94146: variable 'ansible_search_path' from source: unknown 12033 1726867220.94185: calling self._execute() 12033 1726867220.94286: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867220.94298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867220.94318: variable 'omit' from source: magic vars 12033 1726867220.94687: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.94704: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867220.94892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867220.97094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867220.97172: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867220.97251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867220.97264: 
Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867220.97297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867220.97385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867220.97422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867220.97467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867220.97682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867220.97685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867220.97687: variable 'ansible_distribution_major_version' from source: facts 12033 1726867220.97689: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12033 1726867220.97747: variable 'ansible_distribution' from source: facts 12033 1726867220.97755: variable '__network_rh_distros' from source: role '' defaults 12033 1726867220.97768: Evaluated conditional (ansible_distribution in __network_rh_distros): True 12033 1726867220.98022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867220.98051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867220.98080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867220.98121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867220.98148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867220.98203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867220.98236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867220.98268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867220.98311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867220.98329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867220.98382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867220.98411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867220.98458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867220.98489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867220.98509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867220.98894: variable 'network_connections' from source: task vars 12033 1726867220.98898: variable 'port2_profile' from source: play vars 12033 1726867220.98936: variable 'port2_profile' from source: play vars 12033 1726867220.98951: variable 'port1_profile' from source: play vars 12033 1726867220.99021: variable 'port1_profile' from source: play vars 12033 1726867220.99034: variable 'controller_profile' from source: play vars 12033 1726867220.99097: variable 'controller_profile' from source: play vars 12033 1726867220.99119: variable 'network_state' from source: role '' defaults 12033 1726867220.99221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867220.99361: Loading 
TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867220.99404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867220.99444: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867220.99479: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867220.99547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867220.99562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867220.99656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867220.99659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867220.99662: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 12033 1726867220.99664: when evaluation is False, skipping this task 12033 1726867220.99666: _execute() done 12033 1726867220.99668: dumping result to json 12033 1726867220.99674: done dumping result, returning 12033 1726867220.99686: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if 
the system version of the managed host is EL10 or later [0affcac9-a3a5-74bb-502b-000000000e10] 12033 1726867220.99696: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e10 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 12033 1726867220.99916: no more pending results, returning what we have 12033 1726867220.99920: results queue empty 12033 1726867220.99923: checking for any_errors_fatal 12033 1726867220.99931: done checking for any_errors_fatal 12033 1726867220.99932: checking for max_fail_percentage 12033 1726867220.99935: done checking for max_fail_percentage 12033 1726867220.99936: checking to see if all hosts have failed and the running result is not ok 12033 1726867220.99936: done checking to see if all hosts have failed 12033 1726867220.99937: getting the remaining hosts for this loop 12033 1726867220.99939: done getting the remaining hosts for this loop 12033 1726867220.99942: getting the next task for host managed_node3 12033 1726867220.99950: done getting next task for host managed_node3 12033 1726867220.99954: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867220.99958: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867220.99983: getting variables 12033 1726867220.99985: in VariableManager get_vars() 12033 1726867221.00030: Calling all_inventory to load vars for managed_node3 12033 1726867221.00033: Calling groups_inventory to load vars for managed_node3 12033 1726867221.00035: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.00046: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.00049: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.00052: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.00692: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e10 12033 1726867221.00696: WORKER PROCESS EXITING 12033 1726867221.01648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.03287: done with get_vars() 12033 1726867221.03311: done getting variables 12033 1726867221.03374: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:21 -0400 (0:00:00.100) 0:01:00.149 ****** 12033 1726867221.03409: entering _queue_task() for managed_node3/dnf 12033 1726867221.03762: worker is 1 (out of 1 available) 12033 1726867221.03888: exiting _queue_task() for managed_node3/dnf 12033 1726867221.03899: done queuing things up, now waiting for results queue to drain 12033 1726867221.03901: waiting for pending results... 12033 1726867221.04098: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12033 1726867221.04262: in run() - task 0affcac9-a3a5-74bb-502b-000000000e11 12033 1726867221.04283: variable 'ansible_search_path' from source: unknown 12033 1726867221.04291: variable 'ansible_search_path' from source: unknown 12033 1726867221.04380: calling self._execute() 12033 1726867221.04436: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.04448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.04461: variable 'omit' from source: magic vars 12033 1726867221.04855: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.04879: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.05096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867221.07813: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.07818: Loading FilterModule 'encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.07840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.07880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.07904: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.07990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.08025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.08043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.08086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.08138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.08209: variable 'ansible_distribution' from source: facts 12033 1726867221.08212: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.08228: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12033 1726867221.08337: variable '__network_wireless_connections_defined' from source: role '' 
defaults 12033 1726867221.08463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.08487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.08516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.08551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.08674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.08681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.08684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.08686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.08689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 12033 1726867221.08691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.08732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.08751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.08771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.08811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.08830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.08990: variable 'network_connections' from source: task vars 12033 1726867221.09001: variable 'port2_profile' from source: play vars 12033 1726867221.09067: variable 'port2_profile' from source: play vars 12033 1726867221.09082: variable 'port1_profile' from source: play vars 12033 1726867221.09242: variable 'port1_profile' from source: play vars 12033 1726867221.09245: variable 'controller_profile' from source: play vars 12033 1726867221.09248: variable 'controller_profile' from source: play vars 12033 1726867221.09286: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867221.09452: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867221.09492: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867221.09546: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867221.09580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867221.09610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867221.09681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867221.09687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.09690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867221.09733: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867221.09966: variable 'network_connections' from source: task vars 12033 1726867221.09970: variable 'port2_profile' from source: play vars 12033 1726867221.10036: variable 'port2_profile' from source: play vars 12033 1726867221.10110: variable 'port1_profile' from source: play vars 12033 1726867221.10114: variable 'port1_profile' from source: play vars 12033 1726867221.10116: variable 'controller_profile' from 
source: play vars 12033 1726867221.10174: variable 'controller_profile' from source: play vars 12033 1726867221.10198: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867221.10201: when evaluation is False, skipping this task 12033 1726867221.10204: _execute() done 12033 1726867221.10231: dumping result to json 12033 1726867221.10234: done dumping result, returning 12033 1726867221.10236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000e11] 12033 1726867221.10238: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e11 12033 1726867221.10394: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e11 12033 1726867221.10397: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867221.10446: no more pending results, returning what we have 12033 1726867221.10450: results queue empty 12033 1726867221.10451: checking for any_errors_fatal 12033 1726867221.10457: done checking for any_errors_fatal 12033 1726867221.10458: checking for max_fail_percentage 12033 1726867221.10460: done checking for max_fail_percentage 12033 1726867221.10461: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.10462: done checking to see if all hosts have failed 12033 1726867221.10462: getting the remaining hosts for this loop 12033 1726867221.10464: done getting the remaining hosts for this loop 12033 1726867221.10468: getting the next task for host managed_node3 12033 1726867221.10475: done getting next task for host managed_node3 12033 1726867221.10479: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12033 1726867221.10485: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867221.10504: getting variables 12033 1726867221.10505: in VariableManager get_vars() 12033 1726867221.10545: Calling all_inventory to load vars for managed_node3 12033 1726867221.10548: Calling groups_inventory to load vars for managed_node3 12033 1726867221.10550: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.10558: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.10560: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.10562: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.12218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.13839: done with get_vars() 12033 1726867221.13859: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12033 1726867221.13932: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:21 -0400 (0:00:00.105) 0:01:00.255 ****** 12033 1726867221.13964: entering _queue_task() for managed_node3/yum 12033 1726867221.14260: worker is 1 (out of 1 available) 12033 1726867221.14271: exiting _queue_task() for managed_node3/yum 12033 1726867221.14284: done queuing things up, now waiting for results queue to drain 12033 1726867221.14286: waiting for pending results... 
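The YUM-era check queued above is gated by `ansible_distribution_major_version | int < 8`. Facts arrive as strings, which is why the role casts with the Jinja `int` filter before comparing. A minimal Python sketch of the same logic; the major version `"10"` is a hypothetical value merely consistent with this run (the `< 8` branch evaluated False, so the host is on a DNF-era release):

```python
# ansible_distribution_major_version is gathered as a string fact, so the
# role's conditionals cast it with the Jinja "int" filter before comparing.
major = "10"  # hypothetical; any value >= 8 matches this run's trace

# Without the cast, string comparison is lexicographic and misleading:
assert ("10" < "8") is True        # lexicographic surprise
assert (int("10") < 8) is False    # what "| int < 8" actually evaluates

dnf_check = int(major) > 7   # gates the DNF check (main.yml:36 branch)
yum_check = int(major) < 8   # gates the YUM check (main.yml:48) -> skipped
print(dnf_check, yum_check)
```

This is why the trace shows the DNF-side conditional holding while the YUM task is skipped with `false_condition: ansible_distribution_major_version | int < 8`.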
12033 1726867221.14697: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12033 1726867221.14734: in run() - task 0affcac9-a3a5-74bb-502b-000000000e12 12033 1726867221.14784: variable 'ansible_search_path' from source: unknown 12033 1726867221.14793: variable 'ansible_search_path' from source: unknown 12033 1726867221.14814: calling self._execute() 12033 1726867221.14904: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.14924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.15012: variable 'omit' from source: magic vars 12033 1726867221.15362: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.15383: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.15569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867221.18060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.18134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.18175: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.18219: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.18247: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.18482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.18485: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.18488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.18490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.18492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.18544: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.18563: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12033 1726867221.18570: when evaluation is False, skipping this task 12033 1726867221.18580: _execute() done 12033 1726867221.18588: dumping result to json 12033 1726867221.18595: done dumping result, returning 12033 1726867221.18610: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000e12] 12033 1726867221.18620: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e12 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12033 1726867221.18779: no more pending results, returning what we have 12033 1726867221.18782: results queue empty 12033 1726867221.18784: checking for any_errors_fatal 12033 1726867221.18791: done 
checking for any_errors_fatal 12033 1726867221.18791: checking for max_fail_percentage 12033 1726867221.18793: done checking for max_fail_percentage 12033 1726867221.18794: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.18795: done checking to see if all hosts have failed 12033 1726867221.18795: getting the remaining hosts for this loop 12033 1726867221.18798: done getting the remaining hosts for this loop 12033 1726867221.18802: getting the next task for host managed_node3 12033 1726867221.18810: done getting next task for host managed_node3 12033 1726867221.18814: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867221.18819: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867221.18841: getting variables 12033 1726867221.18842: in VariableManager get_vars() 12033 1726867221.18889: Calling all_inventory to load vars for managed_node3 12033 1726867221.18892: Calling groups_inventory to load vars for managed_node3 12033 1726867221.18894: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.18904: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.18907: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.18909: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.19492: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e12 12033 1726867221.19495: WORKER PROCESS EXITING 12033 1726867221.20321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.22018: done with get_vars() 12033 1726867221.22042: done getting variables 12033 1726867221.22109: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:21 -0400 (0:00:00.081) 0:01:00.337 ****** 12033 1726867221.22148: entering _queue_task() for managed_node3/fail 12033 1726867221.22511: worker is 1 (out of 1 available) 12033 1726867221.22525: exiting _queue_task() for managed_node3/fail 12033 1726867221.22539: done queuing things up, now waiting for results queue to drain 12033 1726867221.22541: waiting for pending results... 
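Several tasks in this trace skip on the same pair of role defaults, `__network_wireless_connections_defined` and `__network_team_connections_defined`, each built from a `selectattr` chain over `network_connections`. A plain-Python sketch of that chain, with hypothetical profile data standing in for the `controller_profile`/`port1_profile`/`port2_profile` play vars (the real names and types come from the play, not from this log):

```python
import re

# Hypothetical stand-ins for the play's controller/port profiles.
network_connections = [
    {"name": "controller0", "type": "bond"},
    {"name": "port1", "type": "ethernet"},
    {"name": "port2", "type": "ethernet"},
]

def has_type(items, pattern):
    # Plain-Python equivalent of the role's Jinja chain:
    #   selectattr("type", "defined") | selectattr("type", "match", pattern)
    #     | list | length > 0
    return len([i for i in items
                if "type" in i and re.match(pattern, str(i["type"]))]) > 0

team_defined = has_type(network_connections, r"^team$")
wireless_defined = has_type(network_connections, r"^wireless$")

# Neither type matches a bond/ethernet-only topology, so tasks gated on
# "__network_wireless_connections_defined or __network_team_connections_defined"
# are skipped, exactly as in the trace.
print(wireless_defined or team_defined)
```

The first `selectattr("type", "defined")` guard matters because profiles without a `type` key would otherwise make the `match` test raise rather than evaluate False.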
12033 1726867221.22845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12033 1726867221.23030: in run() - task 0affcac9-a3a5-74bb-502b-000000000e13 12033 1726867221.23052: variable 'ansible_search_path' from source: unknown 12033 1726867221.23062: variable 'ansible_search_path' from source: unknown 12033 1726867221.23108: calling self._execute() 12033 1726867221.23212: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.23230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.23245: variable 'omit' from source: magic vars 12033 1726867221.23617: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.23636: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.23769: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.23959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867221.26221: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.26293: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.26335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.26379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.26414: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.26508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867221.26783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.26787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.26789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.26792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.26794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.26796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.26798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.26800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.26805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.26831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.26860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.26890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.26940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.26957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.27145: variable 'network_connections' from source: task vars 12033 1726867221.27162: variable 'port2_profile' from source: play vars 12033 1726867221.27235: variable 'port2_profile' from source: play vars 12033 1726867221.27258: variable 'port1_profile' from source: play vars 12033 1726867221.27322: variable 'port1_profile' from source: play vars 12033 1726867221.27334: variable 'controller_profile' from source: play vars 12033 1726867221.27392: variable 'controller_profile' from source: play vars 12033 1726867221.27464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867221.27690: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867221.27728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867221.27759: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867221.27798: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867221.27840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867221.27861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867221.27906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.27922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867221.27974: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867221.28236: variable 'network_connections' from source: task vars 12033 1726867221.28240: variable 'port2_profile' from source: play vars 12033 1726867221.28304: variable 'port2_profile' from source: play vars 12033 1726867221.28308: variable 'port1_profile' from source: play vars 12033 1726867221.28373: variable 'port1_profile' from source: play vars 12033 1726867221.28382: variable 'controller_profile' from source: play vars 12033 1726867221.28448: variable 'controller_profile' from source: play vars 12033 1726867221.28474: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867221.28487: when evaluation is False, skipping this task 12033 1726867221.28490: _execute() done 12033 1726867221.28493: dumping result to json 12033 1726867221.28495: done dumping result, returning 12033 1726867221.28497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000e13] 12033 1726867221.28499: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e13 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867221.28729: no more pending results, returning what we have 12033 1726867221.28732: results queue empty 12033 1726867221.28733: checking for any_errors_fatal 12033 1726867221.28740: done checking for any_errors_fatal 12033 1726867221.28741: checking for max_fail_percentage 12033 1726867221.28743: done checking for max_fail_percentage 12033 1726867221.28744: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.28744: done checking to see if all hosts have failed 12033 1726867221.28745: getting the remaining hosts for this loop 12033 1726867221.28747: done getting the remaining hosts for this loop 12033 1726867221.28750: getting the next task for host managed_node3 12033 1726867221.28758: done getting next task for host managed_node3 12033 1726867221.28761: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12033 1726867221.28767: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867221.28788: getting variables 12033 1726867221.28790: in VariableManager get_vars() 12033 1726867221.28832: Calling all_inventory to load vars for managed_node3 12033 1726867221.28835: Calling groups_inventory to load vars for managed_node3 12033 1726867221.28837: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.28845: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.28848: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.28850: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.29391: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e13 12033 1726867221.29395: WORKER PROCESS EXITING 12033 1726867221.30245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.31405: done with get_vars() 12033 1726867221.31424: done getting variables 12033 1726867221.31465: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:21 -0400 (0:00:00.093) 0:01:00.430 ****** 12033 1726867221.31496: entering _queue_task() for managed_node3/package 12033 1726867221.31754: worker is 1 (out of 1 available) 12033 1726867221.31767: exiting _queue_task() for managed_node3/package 12033 1726867221.31782: done queuing things up, now waiting for results queue to drain 12033 1726867221.31784: waiting for pending results... 12033 1726867221.31966: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 12033 1726867221.32071: in run() - task 0affcac9-a3a5-74bb-502b-000000000e14 12033 1726867221.32084: variable 'ansible_search_path' from source: unknown 12033 1726867221.32088: variable 'ansible_search_path' from source: unknown 12033 1726867221.32119: calling self._execute() 12033 1726867221.32189: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.32193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.32205: variable 'omit' from source: magic vars 12033 1726867221.32475: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.32486: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.32663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867221.33083: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867221.33088: Loading 
TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867221.33090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867221.33092: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867221.33172: variable 'network_packages' from source: role '' defaults 12033 1726867221.33280: variable '__network_provider_setup' from source: role '' defaults 12033 1726867221.33296: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867221.33362: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867221.33379: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867221.33482: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867221.33667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867221.35012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.35299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.35329: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.35351: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.35374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.35435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.35454: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.35473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.35508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.35517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.35571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.35586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.35603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.35683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.35686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 
1726867221.35886: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867221.36007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.36037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.36068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.36116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.36184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.36231: variable 'ansible_python' from source: facts 12033 1726867221.36253: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867221.36340: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867221.36429: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867221.36552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.36583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.36615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.36684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.36690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.36728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.36763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.36982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.36986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.36988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.37013: variable 'network_connections' from source: task vars 
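The variable resolution traced above shows the role assembling its default package set: `__network_packages_default_nm` always, with `__network_packages_default_wpa_supplicant` gated on `__network_wpa_supplicant_required`, which in turn follows `__network_ieee802_1x_connections_defined`. The real definitions live in the role's `defaults/main.yml`; a minimal sketch of that gating pattern, with hypothetical values rather than the role's verbatim source, looks like:

```yaml
# Sketch only: illustrative role defaults, not the actual
# fedora.linux_system_roles.network defaults/main.yml content.
__network_packages_default_nm:
  - NetworkManager

# wpa_supplicant is only required when 802.1X connections are defined.
__network_wpa_supplicant_required: "{{ __network_ieee802_1x_connections_defined }}"
__network_packages_default_wpa_supplicant: >-
  {{ ['wpa_supplicant'] if __network_wpa_supplicant_required | bool else [] }}
```

Each `variable ... from source: role '' defaults` line in the log corresponds to one of these lazy Jinja2 lookups being resolved at task-execution time.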
12033 1726867221.37024: variable 'port2_profile' from source: play vars 12033 1726867221.37130: variable 'port2_profile' from source: play vars 12033 1726867221.37139: variable 'port1_profile' from source: play vars 12033 1726867221.37219: variable 'port1_profile' from source: play vars 12033 1726867221.37229: variable 'controller_profile' from source: play vars 12033 1726867221.37297: variable 'controller_profile' from source: play vars 12033 1726867221.37353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867221.37372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867221.37394: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.37419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867221.37462: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.37637: variable 'network_connections' from source: task vars 12033 1726867221.37643: variable 'port2_profile' from source: play vars 12033 1726867221.37715: variable 'port2_profile' from source: play vars 12033 1726867221.37723: variable 'port1_profile' from source: play vars 12033 1726867221.37793: variable 'port1_profile' from source: play vars 12033 1726867221.37801: variable 'controller_profile' from source: play vars 12033 1726867221.37874: variable 'controller_profile' from source: play vars 12033 1726867221.37895: variable '__network_packages_default_wireless' 
from source: role '' defaults 12033 1726867221.37951: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.38149: variable 'network_connections' from source: task vars 12033 1726867221.38152: variable 'port2_profile' from source: play vars 12033 1726867221.38203: variable 'port2_profile' from source: play vars 12033 1726867221.38209: variable 'port1_profile' from source: play vars 12033 1726867221.38253: variable 'port1_profile' from source: play vars 12033 1726867221.38259: variable 'controller_profile' from source: play vars 12033 1726867221.38306: variable 'controller_profile' from source: play vars 12033 1726867221.38327: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867221.38381: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867221.38575: variable 'network_connections' from source: task vars 12033 1726867221.38580: variable 'port2_profile' from source: play vars 12033 1726867221.38678: variable 'port2_profile' from source: play vars 12033 1726867221.38682: variable 'port1_profile' from source: play vars 12033 1726867221.38708: variable 'port1_profile' from source: play vars 12033 1726867221.38722: variable 'controller_profile' from source: play vars 12033 1726867221.38791: variable 'controller_profile' from source: play vars 12033 1726867221.38850: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867221.38982: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867221.38985: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867221.38988: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867221.39221: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867221.44390: variable 'network_connections' from source: task vars 12033 
1726867221.44394: variable 'port2_profile' from source: play vars 12033 1726867221.44447: variable 'port2_profile' from source: play vars 12033 1726867221.44453: variable 'port1_profile' from source: play vars 12033 1726867221.44496: variable 'port1_profile' from source: play vars 12033 1726867221.44503: variable 'controller_profile' from source: play vars 12033 1726867221.44548: variable 'controller_profile' from source: play vars 12033 1726867221.44555: variable 'ansible_distribution' from source: facts 12033 1726867221.44558: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.44563: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.44576: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867221.44688: variable 'ansible_distribution' from source: facts 12033 1726867221.44692: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.44695: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.44708: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867221.44814: variable 'ansible_distribution' from source: facts 12033 1726867221.44817: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.44820: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.44847: variable 'network_provider' from source: set_fact 12033 1726867221.44860: variable 'ansible_facts' from source: unknown 12033 1726867221.45186: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12033 1726867221.45189: when evaluation is False, skipping this task 12033 1726867221.45191: _execute() done 12033 1726867221.45193: dumping result to json 12033 1726867221.45195: done dumping result, returning 12033 1726867221.45198: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-74bb-502b-000000000e14] 12033 1726867221.45200: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e14 12033 1726867221.45291: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e14 12033 1726867221.45294: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12033 1726867221.45341: no more pending results, returning what we have 12033 1726867221.45345: results queue empty 12033 1726867221.45346: checking for any_errors_fatal 12033 1726867221.45353: done checking for any_errors_fatal 12033 1726867221.45353: checking for max_fail_percentage 12033 1726867221.45355: done checking for max_fail_percentage 12033 1726867221.45356: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.45357: done checking to see if all hosts have failed 12033 1726867221.45357: getting the remaining hosts for this loop 12033 1726867221.45359: done getting the remaining hosts for this loop 12033 1726867221.45367: getting the next task for host managed_node3 12033 1726867221.45374: done getting next task for host managed_node3 12033 1726867221.45386: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867221.45391: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867221.45414: getting variables 12033 1726867221.45415: in VariableManager get_vars() 12033 1726867221.45457: Calling all_inventory to load vars for managed_node3 12033 1726867221.45459: Calling groups_inventory to load vars for managed_node3 12033 1726867221.45461: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.45470: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.45472: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.45475: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.50008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.50853: done with get_vars() 12033 1726867221.50869: done getting variables 12033 1726867221.50908: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:21 -0400 (0:00:00.194) 0:01:00.625 ****** 12033 1726867221.50931: entering _queue_task() for managed_node3/package 12033 1726867221.51199: worker is 1 (out of 1 available) 12033 1726867221.51215: exiting _queue_task() for managed_node3/package 12033 1726867221.51227: done queuing things up, now waiting for results queue to drain 12033 1726867221.51230: waiting for pending results... 12033 1726867221.51425: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12033 1726867221.51543: in run() - task 0affcac9-a3a5-74bb-502b-000000000e15 12033 1726867221.51554: variable 'ansible_search_path' from source: unknown 12033 1726867221.51561: variable 'ansible_search_path' from source: unknown 12033 1726867221.51591: calling self._execute() 12033 1726867221.51664: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.51669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.51682: variable 'omit' from source: magic vars 12033 1726867221.51959: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.51970: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.52057: variable 'network_state' from source: role '' defaults 12033 1726867221.52066: Evaluated conditional (network_state != {}): False 12033 1726867221.52070: when evaluation is False, skipping this task 12033 1726867221.52073: _execute() done 12033 1726867221.52076: dumping result to json 12033 1726867221.52080: done dumping result, returning 12033 1726867221.52087: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000e15] 
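The task above is skipped because `network_state` still holds its empty-dict role default, so the `network_state != {}` guard evaluates False. A minimal sketch of a task using that guard (assumed shape, not the role's verbatim source at `roles/network/tasks/main.yml:85`) is:

```yaml
# Sketch only: approximate shape of the guarded install task.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

When a `when:` evaluates False, Ansible reports `skipping:` with the failed condition as `false_condition`, exactly as seen in the JSON result above.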
12033 1726867221.52092: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e15 12033 1726867221.52189: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e15 12033 1726867221.52192: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867221.52260: no more pending results, returning what we have 12033 1726867221.52264: results queue empty 12033 1726867221.52265: checking for any_errors_fatal 12033 1726867221.52273: done checking for any_errors_fatal 12033 1726867221.52273: checking for max_fail_percentage 12033 1726867221.52275: done checking for max_fail_percentage 12033 1726867221.52276: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.52276: done checking to see if all hosts have failed 12033 1726867221.52278: getting the remaining hosts for this loop 12033 1726867221.52280: done getting the remaining hosts for this loop 12033 1726867221.52284: getting the next task for host managed_node3 12033 1726867221.52291: done getting next task for host managed_node3 12033 1726867221.52295: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867221.52300: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867221.52321: getting variables 12033 1726867221.52324: in VariableManager get_vars() 12033 1726867221.52359: Calling all_inventory to load vars for managed_node3 12033 1726867221.52361: Calling groups_inventory to load vars for managed_node3 12033 1726867221.52363: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.52371: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.52373: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.52375: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.53127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.53999: done with get_vars() 12033 1726867221.54017: done getting variables 12033 1726867221.54057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:21 -0400 (0:00:00.031) 0:01:00.656 ****** 12033 1726867221.54084: 
entering _queue_task() for managed_node3/package 12033 1726867221.54314: worker is 1 (out of 1 available) 12033 1726867221.54327: exiting _queue_task() for managed_node3/package 12033 1726867221.54339: done queuing things up, now waiting for results queue to drain 12033 1726867221.54340: waiting for pending results... 12033 1726867221.54517: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12033 1726867221.54619: in run() - task 0affcac9-a3a5-74bb-502b-000000000e16 12033 1726867221.54631: variable 'ansible_search_path' from source: unknown 12033 1726867221.54635: variable 'ansible_search_path' from source: unknown 12033 1726867221.54662: calling self._execute() 12033 1726867221.54734: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.54737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.54746: variable 'omit' from source: magic vars 12033 1726867221.55220: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.55239: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.55368: variable 'network_state' from source: role '' defaults 12033 1726867221.55389: Evaluated conditional (network_state != {}): False 12033 1726867221.55400: when evaluation is False, skipping this task 12033 1726867221.55409: _execute() done 12033 1726867221.55418: dumping result to json 12033 1726867221.55428: done dumping result, returning 12033 1726867221.55440: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-74bb-502b-000000000e16] 12033 1726867221.55453: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e16 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result 
was False" } 12033 1726867221.55614: no more pending results, returning what we have 12033 1726867221.55618: results queue empty 12033 1726867221.55619: checking for any_errors_fatal 12033 1726867221.55782: done checking for any_errors_fatal 12033 1726867221.55783: checking for max_fail_percentage 12033 1726867221.55785: done checking for max_fail_percentage 12033 1726867221.55786: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.55786: done checking to see if all hosts have failed 12033 1726867221.55787: getting the remaining hosts for this loop 12033 1726867221.55788: done getting the remaining hosts for this loop 12033 1726867221.55792: getting the next task for host managed_node3 12033 1726867221.55798: done getting next task for host managed_node3 12033 1726867221.55803: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867221.55808: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867221.55825: getting variables 12033 1726867221.55826: in VariableManager get_vars() 12033 1726867221.55862: Calling all_inventory to load vars for managed_node3 12033 1726867221.55864: Calling groups_inventory to load vars for managed_node3 12033 1726867221.55866: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.55885: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.55888: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.55892: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.56495: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e16 12033 1726867221.56499: WORKER PROCESS EXITING 12033 1726867221.57452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.59176: done with get_vars() 12033 1726867221.59199: done getting variables 12033 1726867221.59266: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:21 -0400 (0:00:00.052) 0:01:00.708 ****** 12033 1726867221.59308: entering _queue_task() for managed_node3/service 12033 1726867221.59546: worker is 1 (out of 1 available) 12033 1726867221.59559: exiting _queue_task() for managed_node3/service 12033 1726867221.59571: done queuing things up, now waiting for results queue to drain 12033 1726867221.59572: 
waiting for pending results... 12033 1726867221.59751: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12033 1726867221.59853: in run() - task 0affcac9-a3a5-74bb-502b-000000000e17 12033 1726867221.59866: variable 'ansible_search_path' from source: unknown 12033 1726867221.59869: variable 'ansible_search_path' from source: unknown 12033 1726867221.59899: calling self._execute() 12033 1726867221.59972: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.59976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.59986: variable 'omit' from source: magic vars 12033 1726867221.60259: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.60269: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.60352: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.60484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867221.61959: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.62016: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.62043: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.62068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.62094: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.62148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867221.62168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.62188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.62219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.62229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.62261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.62278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.62296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.62326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.62336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.62364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.62381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.62397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.62426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.62437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.62550: variable 'network_connections' from source: task vars 12033 1726867221.62559: variable 'port2_profile' from source: play vars 12033 1726867221.62608: variable 'port2_profile' from source: play vars 12033 1726867221.62616: variable 'port1_profile' from source: play vars 12033 1726867221.62660: variable 'port1_profile' from source: play vars 12033 1726867221.62667: variable 'controller_profile' from source: play vars 12033 1726867221.62711: variable 'controller_profile' from source: play vars 12033 1726867221.62762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867221.62882: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867221.62911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867221.62935: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867221.62959: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867221.62990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867221.63007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867221.63023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.63040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867221.63084: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867221.63231: variable 'network_connections' from source: task vars 12033 1726867221.63235: variable 'port2_profile' from source: play vars 12033 1726867221.63280: variable 'port2_profile' from source: play vars 12033 1726867221.63292: variable 'port1_profile' from source: play vars 12033 1726867221.63330: variable 'port1_profile' from source: play vars 12033 1726867221.63337: variable 'controller_profile' from source: play vars 12033 1726867221.63379: variable 'controller_profile' from source: play vars 12033 1726867221.63404: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12033 1726867221.63416: when evaluation is False, skipping this task 12033 1726867221.63419: _execute() done 12033 1726867221.63421: dumping result to json 12033 1726867221.63424: done dumping result, returning 12033 1726867221.63426: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-74bb-502b-000000000e17] 12033 1726867221.63428: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e17 12033 1726867221.63516: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e17 12033 1726867221.63519: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12033 1726867221.63561: no more pending results, returning what we have 12033 1726867221.63565: results queue empty 12033 1726867221.63566: checking for any_errors_fatal 12033 1726867221.63572: done checking for any_errors_fatal 12033 1726867221.63573: checking for max_fail_percentage 12033 1726867221.63575: done checking for max_fail_percentage 12033 1726867221.63576: checking to see if all hosts have failed and the running result is not ok 12033 1726867221.63576: done checking to see if all hosts have failed 12033 1726867221.63579: getting the remaining hosts for this loop 12033 1726867221.63581: done getting the remaining hosts for this loop 12033 1726867221.63584: getting the next task for host managed_node3 12033 1726867221.63591: done getting next task for host managed_node3 12033 1726867221.63595: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867221.63599: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867221.63622: getting variables 12033 1726867221.63624: in VariableManager get_vars() 12033 1726867221.63669: Calling all_inventory to load vars for managed_node3 12033 1726867221.63671: Calling groups_inventory to load vars for managed_node3 12033 1726867221.63673: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867221.63689: Calling all_plugins_play to load vars for managed_node3 12033 1726867221.63692: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867221.63695: Calling groups_plugins_play to load vars for managed_node3 12033 1726867221.64482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867221.65357: done with get_vars() 12033 1726867221.65372: done getting variables 12033 1726867221.65417: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:21 -0400 (0:00:00.061) 0:01:00.770 ****** 12033 1726867221.65441: entering _queue_task() for managed_node3/service 12033 1726867221.65666: worker is 1 (out of 1 available) 12033 1726867221.65682: exiting _queue_task() for managed_node3/service 12033 1726867221.65694: done queuing things up, now waiting for results queue to drain 12033 1726867221.65695: waiting for pending results... 
12033 1726867221.65868: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12033 1726867221.65974: in run() - task 0affcac9-a3a5-74bb-502b-000000000e18 12033 1726867221.65986: variable 'ansible_search_path' from source: unknown 12033 1726867221.65990: variable 'ansible_search_path' from source: unknown 12033 1726867221.66019: calling self._execute() 12033 1726867221.66092: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.66096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.66108: variable 'omit' from source: magic vars 12033 1726867221.66369: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.66380: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867221.66489: variable 'network_provider' from source: set_fact 12033 1726867221.66493: variable 'network_state' from source: role '' defaults 12033 1726867221.66501: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12033 1726867221.66507: variable 'omit' from source: magic vars 12033 1726867221.66554: variable 'omit' from source: magic vars 12033 1726867221.66574: variable 'network_service_name' from source: role '' defaults 12033 1726867221.66628: variable 'network_service_name' from source: role '' defaults 12033 1726867221.66699: variable '__network_provider_setup' from source: role '' defaults 12033 1726867221.66705: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867221.66747: variable '__network_service_name_default_nm' from source: role '' defaults 12033 1726867221.66755: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867221.66806: variable '__network_packages_default_nm' from source: role '' defaults 12033 1726867221.67033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12033 1726867221.68751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867221.68982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867221.68985: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867221.68987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867221.68989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867221.69003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.69037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.69066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.69112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.69131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.69180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12033 1726867221.69209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.69237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.69283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.69302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.69518: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12033 1726867221.69629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.69661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.69693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.69733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.69750: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.69835: variable 'ansible_python' from source: facts 12033 1726867221.69853: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12033 1726867221.69932: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 1726867221.70010: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867221.70132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.70161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.70192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.70233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.70257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.70315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867221.70337: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867221.70374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.70399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867221.70411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867221.70509: variable 'network_connections' from source: task vars 12033 1726867221.70512: variable 'port2_profile' from source: play vars 12033 1726867221.70561: variable 'port2_profile' from source: play vars 12033 1726867221.70571: variable 'port1_profile' from source: play vars 12033 1726867221.70625: variable 'port1_profile' from source: play vars 12033 1726867221.70635: variable 'controller_profile' from source: play vars 12033 1726867221.70686: variable 'controller_profile' from source: play vars 12033 1726867221.70756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867221.70885: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867221.70923: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867221.70953: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867221.70982: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867221.71133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867221.71136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867221.71139: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867221.71142: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867221.71144: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.71298: variable 'network_connections' from source: task vars 12033 1726867221.71305: variable 'port2_profile' from source: play vars 12033 1726867221.71354: variable 'port2_profile' from source: play vars 12033 1726867221.71367: variable 'port1_profile' from source: play vars 12033 1726867221.71416: variable 'port1_profile' from source: play vars 12033 1726867221.71425: variable 'controller_profile' from source: play vars 12033 1726867221.71479: variable 'controller_profile' from source: play vars 12033 1726867221.71504: variable '__network_packages_default_wireless' from source: role '' defaults 12033 1726867221.71553: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867221.71738: variable 'network_connections' from source: task vars 12033 1726867221.71741: variable 'port2_profile' from source: play vars 12033 1726867221.71792: variable 'port2_profile' from source: play vars 12033 
1726867221.71800: variable 'port1_profile' from source: play vars 12033 1726867221.71847: variable 'port1_profile' from source: play vars 12033 1726867221.71853: variable 'controller_profile' from source: play vars 12033 1726867221.71907: variable 'controller_profile' from source: play vars 12033 1726867221.71925: variable '__network_packages_default_team' from source: role '' defaults 12033 1726867221.71978: variable '__network_team_connections_defined' from source: role '' defaults 12033 1726867221.72167: variable 'network_connections' from source: task vars 12033 1726867221.72381: variable 'port2_profile' from source: play vars 12033 1726867221.72384: variable 'port2_profile' from source: play vars 12033 1726867221.72387: variable 'port1_profile' from source: play vars 12033 1726867221.72389: variable 'port1_profile' from source: play vars 12033 1726867221.72390: variable 'controller_profile' from source: play vars 12033 1726867221.72394: variable 'controller_profile' from source: play vars 12033 1726867221.72450: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867221.72511: variable '__network_service_name_default_initscripts' from source: role '' defaults 12033 1726867221.72524: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867221.72609: variable '__network_packages_default_initscripts' from source: role '' defaults 12033 1726867221.72806: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12033 1726867221.73287: variable 'network_connections' from source: task vars 12033 1726867221.73297: variable 'port2_profile' from source: play vars 12033 1726867221.73356: variable 'port2_profile' from source: play vars 12033 1726867221.73370: variable 'port1_profile' from source: play vars 12033 1726867221.73432: variable 'port1_profile' from source: play vars 12033 1726867221.73444: variable 'controller_profile' from source: play vars 12033 
1726867221.73505: variable 'controller_profile' from source: play vars 12033 1726867221.73517: variable 'ansible_distribution' from source: facts 12033 1726867221.73525: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.73535: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.73552: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12033 1726867221.73720: variable 'ansible_distribution' from source: facts 12033 1726867221.73729: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.73739: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.73756: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12033 1726867221.73920: variable 'ansible_distribution' from source: facts 12033 1726867221.73928: variable '__network_rh_distros' from source: role '' defaults 12033 1726867221.73937: variable 'ansible_distribution_major_version' from source: facts 12033 1726867221.73973: variable 'network_provider' from source: set_fact 12033 1726867221.74004: variable 'omit' from source: magic vars 12033 1726867221.74037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867221.74071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867221.74098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867221.74122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867221.74138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867221.74173: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867221.74187: 
variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.74194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.74302: Set connection var ansible_pipelining to False 12033 1726867221.74317: Set connection var ansible_shell_executable to /bin/sh 12033 1726867221.74329: Set connection var ansible_timeout to 10 12033 1726867221.74338: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867221.74345: Set connection var ansible_connection to ssh 12033 1726867221.74355: Set connection var ansible_shell_type to sh 12033 1726867221.74385: variable 'ansible_shell_executable' from source: unknown 12033 1726867221.74393: variable 'ansible_connection' from source: unknown 12033 1726867221.74582: variable 'ansible_module_compression' from source: unknown 12033 1726867221.74585: variable 'ansible_shell_type' from source: unknown 12033 1726867221.74588: variable 'ansible_shell_executable' from source: unknown 12033 1726867221.74590: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867221.74592: variable 'ansible_pipelining' from source: unknown 12033 1726867221.74594: variable 'ansible_timeout' from source: unknown 12033 1726867221.74595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867221.74598: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867221.74600: variable 'omit' from source: magic vars 12033 1726867221.74602: starting attempt loop 12033 1726867221.74604: running the handler 12033 1726867221.74630: variable 'ansible_facts' from source: unknown 12033 1726867221.75296: _low_level_execute_command(): starting 12033 1726867221.75308: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867221.75973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867221.75989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867221.76007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867221.76025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867221.76041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867221.76096: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867221.76149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867221.76166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867221.76190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867221.76280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867221.77967: stdout chunk (state=3): >>>/root <<< 12033 1726867221.78094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867221.78113: stderr chunk (state=3): >>><<< 12033 1726867221.78126: stdout chunk 
(state=3): >>><<< 12033 1726867221.78149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867221.78166: _low_level_execute_command(): starting 12033 1726867221.78176: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030 `" && echo ansible-tmp-1726867221.781547-14892-49378433337030="` echo /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030 `" ) && sleep 0' 12033 1726867221.78812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867221.78843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867221.78861: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12033 1726867221.78904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867221.78923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867221.78947: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867221.78992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867221.79068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867221.79096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867221.79150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867221.79225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867221.81091: stdout chunk (state=3): >>>ansible-tmp-1726867221.781547-14892-49378433337030=/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030 <<< 12033 1726867221.81219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867221.81226: stdout chunk (state=3): >>><<< 12033 1726867221.81237: stderr chunk (state=3): >>><<< 12033 1726867221.81251: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867221.781547-14892-49378433337030=/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867221.81283: variable 'ansible_module_compression' from source: unknown 12033 1726867221.81324: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12033 1726867221.81371: variable 'ansible_facts' from source: unknown 12033 1726867221.81506: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py 12033 1726867221.81619: Sending initial data 12033 1726867221.81629: Sent initial data (154 bytes) 12033 1726867221.82295: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867221.82326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867221.82343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867221.82424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867221.82504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867221.84031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867221.84072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867221.84126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6j9hnkdm /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py <<< 12033 1726867221.84128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py" <<< 12033 1726867221.84165: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp6j9hnkdm" to remote "/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py" <<< 12033 1726867221.84171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py" <<< 12033 1726867221.85658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867221.85661: stderr chunk (state=3): >>><<< 12033 1726867221.85664: stdout chunk (state=3): >>><<< 12033 1726867221.85674: done transferring module to remote 12033 1726867221.85696: _low_level_execute_command(): starting 12033 1726867221.85706: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/ /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py && sleep 0' 12033 1726867221.86192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867221.86218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867221.86222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867221.86271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867221.86280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867221.86324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867221.88098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867221.88116: stderr chunk (state=3): >>><<< 12033 1726867221.88127: stdout chunk (state=3): >>><<< 12033 1726867221.88143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867221.88224: _low_level_execute_command(): starting 12033 1726867221.88228: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/AnsiballZ_systemd.py && sleep 0' 12033 1726867221.88722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867221.88726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867221.88728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration <<< 12033 1726867221.88734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867221.88736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
12033 1726867221.88785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867221.88810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867221.88885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.18321: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10485760", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321245696", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1022959000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 12033 1726867222.18336: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system<<< 12033 1726867222.18345: stdout chunk (state=3): >>>.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12033 1726867222.20274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867222.20311: stderr chunk (state=3): >>><<< 12033 1726867222.20314: stdout chunk (state=3): >>><<< 12033 1726867222.20332: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainStartTimestampMonotonic": "21397904", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ExecMainHandoffTimestampMonotonic": "21411941", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10485760", "MemoryPeak": "13291520", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3321245696", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1022959000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service shutdown.target multi-user.target network.target", "After": "system.slice systemd-journald.socket sysinit.target basic.target dbus.socket cloud-init-local.service network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:18:52 EDT", "StateChangeTimestampMonotonic": "369615617", "InactiveExitTimestamp": "Fri 2024-09-20 17:13:04 EDT", "InactiveExitTimestampMonotonic": "21398448", "ActiveEnterTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ActiveEnterTimestampMonotonic": "21815124", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:13:04 EDT", "ConditionTimestampMonotonic": "21397034", "AssertTimestamp": "Fri 2024-09-20 17:13:04 EDT", "AssertTimestampMonotonic": "21397036", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6662de0c35f9440589adc21215384405", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867222.20450: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867222.20466: _low_level_execute_command(): starting 12033 1726867222.20471: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867221.781547-14892-49378433337030/ > /dev/null 2>&1 && sleep 0' 12033 1726867222.20924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.20928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867222.20930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.20932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867222.20934: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12033 1726867222.20936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.20983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867222.20989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867222.21004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.21039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.22883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867222.22910: stderr chunk (state=3): >>><<< 12033 1726867222.22913: stdout chunk (state=3): >>><<< 12033 1726867222.22926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867222.22932: handler run complete 12033 1726867222.22968: attempt loop complete, returning result 12033 1726867222.22971: _execute() done 12033 1726867222.22973: dumping result to json 12033 1726867222.22987: done dumping result, returning 12033 1726867222.22995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-74bb-502b-000000000e18] 12033 1726867222.23002: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e18 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867222.23510: no more pending results, returning what we have 12033 1726867222.23513: results queue empty 12033 1726867222.23513: checking for any_errors_fatal 12033 1726867222.23516: done checking for any_errors_fatal 12033 1726867222.23517: checking for max_fail_percentage 12033 1726867222.23518: done checking for max_fail_percentage 12033 1726867222.23519: checking to see if all hosts have failed and the running result is not ok 12033 1726867222.23519: done checking to see if all hosts have failed 12033 1726867222.23520: getting the remaining hosts for this loop 12033 1726867222.23521: done getting the remaining hosts for this loop 12033 1726867222.23523: getting the next task for host managed_node3 12033 1726867222.23528: done getting next task for host managed_node3 12033 1726867222.23530: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867222.23535: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867222.23544: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e18 12033 1726867222.23547: WORKER PROCESS EXITING 12033 1726867222.23553: getting variables 12033 1726867222.23554: in VariableManager get_vars() 12033 1726867222.23581: Calling all_inventory to load vars for managed_node3 12033 1726867222.23583: Calling groups_inventory to load vars for managed_node3 12033 1726867222.23584: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867222.23591: Calling all_plugins_play to load vars for managed_node3 12033 1726867222.23593: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867222.23595: Calling groups_plugins_play to load vars for managed_node3 12033 1726867222.24257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867222.25123: done with get_vars() 12033 1726867222.25138: done getting variables 12033 1726867222.25181: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:22 -0400 (0:00:00.597) 0:01:01.367 ****** 12033 1726867222.25210: entering _queue_task() for managed_node3/service 12033 1726867222.25435: worker is 1 (out of 1 available) 12033 1726867222.25446: exiting _queue_task() for managed_node3/service 12033 1726867222.25456: done queuing things up, now waiting for results queue to drain 12033 1726867222.25458: waiting for pending results... 12033 1726867222.25649: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12033 1726867222.25755: in run() - task 0affcac9-a3a5-74bb-502b-000000000e19 12033 1726867222.25766: variable 'ansible_search_path' from source: unknown 12033 1726867222.25770: variable 'ansible_search_path' from source: unknown 12033 1726867222.25801: calling self._execute() 12033 1726867222.25873: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.25879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867222.25888: variable 'omit' from source: magic vars 12033 1726867222.26167: variable 'ansible_distribution_major_version' from source: facts 12033 1726867222.26178: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867222.26267: variable 'network_provider' from source: set_fact 12033 1726867222.26271: Evaluated conditional (network_provider == "nm"): True 12033 1726867222.26341: variable '__network_wpa_supplicant_required' from source: role '' defaults 12033 
1726867222.26401: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12033 1726867222.26518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867222.27941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867222.27986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867222.28014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867222.28038: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867222.28057: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867222.28220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867222.28240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867222.28257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867222.28284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867222.28300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 12033 1726867222.28331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867222.28347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867222.28364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867222.28390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867222.28404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867222.28432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867222.28449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867222.28464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867222.28490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867222.28500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867222.28597: variable 'network_connections' from source: task vars 12033 1726867222.28607: variable 'port2_profile' from source: play vars 12033 1726867222.28652: variable 'port2_profile' from source: play vars 12033 1726867222.28661: variable 'port1_profile' from source: play vars 12033 1726867222.28706: variable 'port1_profile' from source: play vars 12033 1726867222.28711: variable 'controller_profile' from source: play vars 12033 1726867222.28756: variable 'controller_profile' from source: play vars 12033 1726867222.28806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12033 1726867222.28913: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12033 1726867222.28943: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12033 1726867222.28967: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12033 1726867222.28990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12033 1726867222.29020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12033 1726867222.29035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12033 1726867222.29053: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867222.29074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12033 1726867222.29114: variable '__network_wireless_connections_defined' from source: role '' defaults 12033 1726867222.29276: variable 'network_connections' from source: task vars 12033 1726867222.29280: variable 'port2_profile' from source: play vars 12033 1726867222.29324: variable 'port2_profile' from source: play vars 12033 1726867222.29330: variable 'port1_profile' from source: play vars 12033 1726867222.29372: variable 'port1_profile' from source: play vars 12033 1726867222.29385: variable 'controller_profile' from source: play vars 12033 1726867222.29424: variable 'controller_profile' from source: play vars 12033 1726867222.29447: Evaluated conditional (__network_wpa_supplicant_required): False 12033 1726867222.29451: when evaluation is False, skipping this task 12033 1726867222.29453: _execute() done 12033 1726867222.29456: dumping result to json 12033 1726867222.29458: done dumping result, returning 12033 1726867222.29464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-74bb-502b-000000000e19] 12033 1726867222.29469: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e19 12033 1726867222.29556: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e19 12033 1726867222.29559: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12033 1726867222.29605: no more pending results, returning what we have 12033 1726867222.29609: results queue empty 
12033 1726867222.29610: checking for any_errors_fatal 12033 1726867222.29636: done checking for any_errors_fatal 12033 1726867222.29636: checking for max_fail_percentage 12033 1726867222.29638: done checking for max_fail_percentage 12033 1726867222.29639: checking to see if all hosts have failed and the running result is not ok 12033 1726867222.29640: done checking to see if all hosts have failed 12033 1726867222.29640: getting the remaining hosts for this loop 12033 1726867222.29642: done getting the remaining hosts for this loop 12033 1726867222.29646: getting the next task for host managed_node3 12033 1726867222.29654: done getting next task for host managed_node3 12033 1726867222.29657: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867222.29661: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867222.29681: getting variables 12033 1726867222.29683: in VariableManager get_vars() 12033 1726867222.29726: Calling all_inventory to load vars for managed_node3 12033 1726867222.29729: Calling groups_inventory to load vars for managed_node3 12033 1726867222.29731: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867222.29740: Calling all_plugins_play to load vars for managed_node3 12033 1726867222.29742: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867222.29744: Calling groups_plugins_play to load vars for managed_node3 12033 1726867222.30582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867222.31456: done with get_vars() 12033 1726867222.31471: done getting variables 12033 1726867222.31515: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:22 -0400 (0:00:00.063) 0:01:01.431 ****** 12033 1726867222.31538: entering _queue_task() for managed_node3/service 12033 1726867222.31754: worker is 1 (out of 1 available) 12033 1726867222.31769: exiting _queue_task() for managed_node3/service 12033 1726867222.31783: done queuing things up, now waiting for results queue to drain 12033 1726867222.31785: waiting for pending results... 
12033 1726867222.31958: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 12033 1726867222.32059: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1a 12033 1726867222.32070: variable 'ansible_search_path' from source: unknown 12033 1726867222.32075: variable 'ansible_search_path' from source: unknown 12033 1726867222.32106: calling self._execute() 12033 1726867222.32174: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.32180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867222.32188: variable 'omit' from source: magic vars 12033 1726867222.32457: variable 'ansible_distribution_major_version' from source: facts 12033 1726867222.32467: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867222.32545: variable 'network_provider' from source: set_fact 12033 1726867222.32551: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867222.32555: when evaluation is False, skipping this task 12033 1726867222.32558: _execute() done 12033 1726867222.32560: dumping result to json 12033 1726867222.32563: done dumping result, returning 12033 1726867222.32566: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-74bb-502b-000000000e1a] 12033 1726867222.32576: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1a 12033 1726867222.32659: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1a 12033 1726867222.32662: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12033 1726867222.32724: no more pending results, returning what we have 12033 1726867222.32727: results queue empty 12033 1726867222.32728: checking for any_errors_fatal 12033 1726867222.32733: done checking for 
any_errors_fatal 12033 1726867222.32734: checking for max_fail_percentage 12033 1726867222.32735: done checking for max_fail_percentage 12033 1726867222.32736: checking to see if all hosts have failed and the running result is not ok 12033 1726867222.32737: done checking to see if all hosts have failed 12033 1726867222.32737: getting the remaining hosts for this loop 12033 1726867222.32739: done getting the remaining hosts for this loop 12033 1726867222.32741: getting the next task for host managed_node3 12033 1726867222.32748: done getting next task for host managed_node3 12033 1726867222.32751: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867222.32756: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867222.32772: getting variables 12033 1726867222.32774: in VariableManager get_vars() 12033 1726867222.32814: Calling all_inventory to load vars for managed_node3 12033 1726867222.32817: Calling groups_inventory to load vars for managed_node3 12033 1726867222.32819: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867222.32826: Calling all_plugins_play to load vars for managed_node3 12033 1726867222.32828: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867222.32830: Calling groups_plugins_play to load vars for managed_node3 12033 1726867222.33544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867222.34422: done with get_vars() 12033 1726867222.34436: done getting variables 12033 1726867222.34476: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:22 -0400 (0:00:00.029) 0:01:01.460 ****** 12033 1726867222.34505: entering _queue_task() for managed_node3/copy 12033 1726867222.34698: worker is 1 (out of 1 available) 12033 1726867222.34713: exiting _queue_task() for managed_node3/copy 12033 1726867222.34725: done queuing things up, now waiting for results queue to drain 12033 1726867222.34726: waiting for pending results... 
12033 1726867222.34887: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12033 1726867222.34988: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1b 12033 1726867222.34998: variable 'ansible_search_path' from source: unknown 12033 1726867222.35004: variable 'ansible_search_path' from source: unknown 12033 1726867222.35027: calling self._execute() 12033 1726867222.35095: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.35099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867222.35108: variable 'omit' from source: magic vars 12033 1726867222.35359: variable 'ansible_distribution_major_version' from source: facts 12033 1726867222.35368: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867222.35449: variable 'network_provider' from source: set_fact 12033 1726867222.35453: Evaluated conditional (network_provider == "initscripts"): False 12033 1726867222.35456: when evaluation is False, skipping this task 12033 1726867222.35458: _execute() done 12033 1726867222.35461: dumping result to json 12033 1726867222.35463: done dumping result, returning 12033 1726867222.35472: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-74bb-502b-000000000e1b] 12033 1726867222.35475: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1b 12033 1726867222.35567: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1b 12033 1726867222.35570: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12033 1726867222.35635: no more pending results, returning what we have 12033 1726867222.35638: results queue empty 12033 1726867222.35639: checking for 
any_errors_fatal 12033 1726867222.35644: done checking for any_errors_fatal 12033 1726867222.35644: checking for max_fail_percentage 12033 1726867222.35646: done checking for max_fail_percentage 12033 1726867222.35648: checking to see if all hosts have failed and the running result is not ok 12033 1726867222.35648: done checking to see if all hosts have failed 12033 1726867222.35649: getting the remaining hosts for this loop 12033 1726867222.35650: done getting the remaining hosts for this loop 12033 1726867222.35653: getting the next task for host managed_node3 12033 1726867222.35659: done getting next task for host managed_node3 12033 1726867222.35662: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867222.35667: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867222.35685: getting variables 12033 1726867222.35687: in VariableManager get_vars() 12033 1726867222.35720: Calling all_inventory to load vars for managed_node3 12033 1726867222.35722: Calling groups_inventory to load vars for managed_node3 12033 1726867222.35724: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867222.35729: Calling all_plugins_play to load vars for managed_node3 12033 1726867222.35731: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867222.35733: Calling groups_plugins_play to load vars for managed_node3 12033 1726867222.36556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867222.37609: done with get_vars() 12033 1726867222.37624: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:22 -0400 (0:00:00.031) 0:01:01.492 ****** 12033 1726867222.37683: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867222.37893: worker is 1 (out of 1 available) 12033 1726867222.37908: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 12033 1726867222.37920: done queuing things up, now waiting for results queue to drain 12033 1726867222.37921: waiting for pending results... 
12033 1726867222.38095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12033 1726867222.38197: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1c 12033 1726867222.38211: variable 'ansible_search_path' from source: unknown 12033 1726867222.38216: variable 'ansible_search_path' from source: unknown 12033 1726867222.38242: calling self._execute() 12033 1726867222.38312: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.38316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867222.38325: variable 'omit' from source: magic vars 12033 1726867222.38654: variable 'ansible_distribution_major_version' from source: facts 12033 1726867222.38658: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867222.38660: variable 'omit' from source: magic vars 12033 1726867222.38715: variable 'omit' from source: magic vars 12033 1726867222.38874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12033 1726867222.41134: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12033 1726867222.41205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12033 1726867222.41257: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12033 1726867222.41300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12033 1726867222.41446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12033 1726867222.41451: variable 'network_provider' from source: set_fact 12033 1726867222.41568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12033 1726867222.41605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12033 1726867222.41637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12033 1726867222.41693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12033 1726867222.41717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12033 1726867222.41804: variable 'omit' from source: magic vars 12033 1726867222.41936: variable 'omit' from source: magic vars 12033 1726867222.42014: variable 'network_connections' from source: task vars 12033 1726867222.42023: variable 'port2_profile' from source: play vars 12033 1726867222.42072: variable 'port2_profile' from source: play vars 12033 1726867222.42080: variable 'port1_profile' from source: play vars 12033 1726867222.42126: variable 'port1_profile' from source: play vars 12033 1726867222.42133: variable 'controller_profile' from source: play vars 12033 1726867222.42175: variable 'controller_profile' from source: play vars 12033 1726867222.42291: variable 'omit' from source: magic vars 12033 1726867222.42298: variable '__lsr_ansible_managed' from source: task vars 12033 1726867222.42344: variable '__lsr_ansible_managed' from source: task vars 12033 1726867222.42479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12033 
1726867222.42631: Loaded config def from plugin (lookup/template) 12033 1726867222.42634: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12033 1726867222.42657: File lookup term: get_ansible_managed.j2 12033 1726867222.42660: variable 'ansible_search_path' from source: unknown 12033 1726867222.42665: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12033 1726867222.42676: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12033 1726867222.42691: variable 'ansible_search_path' from source: unknown 12033 1726867222.47984: variable 'ansible_managed' from source: unknown 12033 1726867222.48061: variable 'omit' from source: magic vars 12033 1726867222.48086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867222.48114: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867222.48129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867222.48145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867222.48155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867222.48181: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867222.48184: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.48187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867222.48276: Set connection var ansible_pipelining to False 12033 1726867222.48327: Set connection var ansible_shell_executable to /bin/sh 12033 1726867222.48330: Set connection var ansible_timeout to 10 12033 1726867222.48335: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867222.48337: Set connection var ansible_connection to ssh 12033 1726867222.48339: Set connection var ansible_shell_type to sh 12033 1726867222.48342: variable 'ansible_shell_executable' from source: unknown 12033 1726867222.48345: variable 'ansible_connection' from source: unknown 12033 1726867222.48347: variable 'ansible_module_compression' from source: unknown 12033 1726867222.48349: variable 'ansible_shell_type' from source: unknown 12033 1726867222.48351: variable 'ansible_shell_executable' from source: unknown 12033 1726867222.48353: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867222.48355: variable 'ansible_pipelining' from source: unknown 12033 1726867222.48356: variable 'ansible_timeout' from source: unknown 12033 1726867222.48366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 
1726867222.48548: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867222.48552: variable 'omit' from source: magic vars 12033 1726867222.48554: starting attempt loop 12033 1726867222.48556: running the handler 12033 1726867222.48559: _low_level_execute_command(): starting 12033 1726867222.48561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867222.49163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867222.49174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.49187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867222.49204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867222.49216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867222.49223: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867222.49233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.49246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867222.49254: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867222.49261: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867222.49268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.49279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867222.49294: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867222.49380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867222.49385: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867222.49387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.49389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867222.49418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867222.49431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.49506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.51200: stdout chunk (state=3): >>>/root <<< 12033 1726867222.51302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867222.51436: stderr chunk (state=3): >>><<< 12033 1726867222.51439: stdout chunk (state=3): >>><<< 12033 1726867222.51442: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867222.51444: _low_level_execute_command(): starting 12033 1726867222.51447: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675 `" && echo ansible-tmp-1726867222.5135386-14911-222348496184675="` echo /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675 `" ) && sleep 0' 12033 1726867222.51979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867222.51993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.52010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867222.52032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867222.52048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867222.52059: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867222.52071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.52090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867222.52142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.52194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867222.52213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867222.52237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.52314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.54194: stdout chunk (state=3): >>>ansible-tmp-1726867222.5135386-14911-222348496184675=/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675 <<< 12033 1726867222.54358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867222.54361: stdout chunk (state=3): >>><<< 12033 1726867222.54364: stderr chunk (state=3): >>><<< 12033 1726867222.54686: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867222.5135386-14911-222348496184675=/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867222.54695: variable 'ansible_module_compression' from source: unknown 12033 1726867222.54698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12033 1726867222.54735: variable 'ansible_facts' from source: unknown 12033 1726867222.54890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py 12033 1726867222.55027: Sending initial data 12033 1726867222.55091: Sent initial data (168 bytes) 12033 1726867222.55589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.55601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.55614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.55686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867222.55689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.55737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.57313: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867222.57327: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12033 1726867222.57346: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12033 1726867222.57376: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867222.57428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867222.57503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgwfirg2j /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py <<< 12033 1726867222.57506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py" <<< 12033 1726867222.57535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpgwfirg2j" to remote "/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py" <<< 12033 1726867222.59182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867222.59185: stdout chunk (state=3): >>><<< 12033 1726867222.59188: stderr chunk (state=3): >>><<< 12033 1726867222.59190: done transferring module to remote 12033 1726867222.59192: _low_level_execute_command(): starting 12033 1726867222.59194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/ /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py && sleep 0' 12033 1726867222.59836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867222.59850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.59863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867222.59988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match not found <<< 12033 1726867222.59992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867222.60029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.60097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867222.62095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867222.62099: stdout chunk (state=3): >>><<< 12033 1726867222.62101: stderr chunk (state=3): >>><<< 12033 1726867222.62107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867222.62109: _low_level_execute_command(): starting 12033 1726867222.62111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/AnsiballZ_network_connections.py && sleep 0' 12033 1726867222.63156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867222.63171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867222.63207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867222.63227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867222.63293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867222.63483: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867222.63613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.20002: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/3fbfe47f-1f56-4efb-93c9-a69913e19a91: error=unknown <<< 12033 1726867223.21624: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/67803b09-25b3-4aa2-bd80-0afe9d2728fb: error=unknown <<< 12033 1726867223.23293: stdout chunk 
(state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 12033 1726867223.23326: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/560fcc66-c438-49a7-835e-aed531442b3e: error=unknown <<< 12033 1726867223.23610: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12033 1726867223.25481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867223.25493: stdout chunk (state=3): >>><<< 12033 1726867223.25517: stderr chunk (state=3): >>><<< 12033 1726867223.25681: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/3fbfe47f-1f56-4efb-93c9-a69913e19a91: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/67803b09-25b3-4aa2-bd80-0afe9d2728fb: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_vk8m3rw0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/560fcc66-c438-49a7-835e-aed531442b3e: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867223.25689: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867223.25692: 
_low_level_execute_command(): starting 12033 1726867223.25695: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867222.5135386-14911-222348496184675/ > /dev/null 2>&1 && sleep 0' 12033 1726867223.26285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867223.26298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867223.26375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.26426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867223.26450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.26526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.28444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.28466: stdout chunk (state=3): >>><<< 12033 1726867223.28482: stderr chunk (state=3): >>><<< 12033 1726867223.28504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867223.28516: handler run complete 12033 1726867223.28551: attempt loop complete, returning result 12033 1726867223.28558: _execute() done 12033 1726867223.28583: dumping result to json 12033 1726867223.28588: done dumping result, returning 12033 1726867223.28641: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-74bb-502b-000000000e1c] 12033 1726867223.28644: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1c 12033 1726867223.28745: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1c 12033 1726867223.28749: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", 
"persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12033 1726867223.28991: no more pending results, returning what we have 12033 1726867223.28997: results queue empty 12033 1726867223.28998: checking for any_errors_fatal 12033 1726867223.29004: done checking for any_errors_fatal 12033 1726867223.29005: checking for max_fail_percentage 12033 1726867223.29006: done checking for max_fail_percentage 12033 1726867223.29007: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.29008: done checking to see if all hosts have failed 12033 1726867223.29009: getting the remaining hosts for this loop 12033 1726867223.29010: done getting the remaining hosts for this loop 12033 1726867223.29013: getting the next task for host managed_node3 12033 1726867223.29020: done getting next task for host managed_node3 12033 1726867223.29023: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867223.29028: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867223.29040: getting variables 12033 1726867223.29041: in VariableManager get_vars() 12033 1726867223.29185: Calling all_inventory to load vars for managed_node3 12033 1726867223.29189: Calling groups_inventory to load vars for managed_node3 12033 1726867223.29196: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.29208: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.29211: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.29214: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.30169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.31158: done with get_vars() 12033 1726867223.31174: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:23 -0400 (0:00:00.935) 0:01:02.428 ****** 12033 1726867223.31240: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867223.31797: worker is 1 (out of 1 available) 12033 1726867223.31807: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 12033 1726867223.31819: done queuing things up, now waiting for results queue to drain 12033 1726867223.31821: waiting for pending results... 
12033 1726867223.31852: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 12033 1726867223.32085: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1d 12033 1726867223.32089: variable 'ansible_search_path' from source: unknown 12033 1726867223.32091: variable 'ansible_search_path' from source: unknown 12033 1726867223.32097: calling self._execute() 12033 1726867223.32245: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.32276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.32300: variable 'omit' from source: magic vars 12033 1726867223.32648: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.32663: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.32747: variable 'network_state' from source: role '' defaults 12033 1726867223.32756: Evaluated conditional (network_state != {}): False 12033 1726867223.32760: when evaluation is False, skipping this task 12033 1726867223.32766: _execute() done 12033 1726867223.32768: dumping result to json 12033 1726867223.32773: done dumping result, returning 12033 1726867223.32780: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-74bb-502b-000000000e1d] 12033 1726867223.32786: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1d 12033 1726867223.32875: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1d 12033 1726867223.32880: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12033 1726867223.32961: no more pending results, returning what we have 12033 1726867223.32964: results queue empty 12033 1726867223.32965: checking for any_errors_fatal 12033 1726867223.32972: done checking for any_errors_fatal 
12033 1726867223.32973: checking for max_fail_percentage 12033 1726867223.32975: done checking for max_fail_percentage 12033 1726867223.32976: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.32978: done checking to see if all hosts have failed 12033 1726867223.32979: getting the remaining hosts for this loop 12033 1726867223.32981: done getting the remaining hosts for this loop 12033 1726867223.32984: getting the next task for host managed_node3 12033 1726867223.32990: done getting next task for host managed_node3 12033 1726867223.32993: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867223.32997: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867223.33017: getting variables 12033 1726867223.33018: in VariableManager get_vars() 12033 1726867223.33053: Calling all_inventory to load vars for managed_node3 12033 1726867223.33056: Calling groups_inventory to load vars for managed_node3 12033 1726867223.33058: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.33066: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.33068: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.33071: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.33809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.34912: done with get_vars() 12033 1726867223.34931: done getting variables 12033 1726867223.34991: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:23 -0400 (0:00:00.037) 0:01:02.466 ****** 12033 1726867223.35030: entering _queue_task() for managed_node3/debug 12033 1726867223.35297: worker is 1 (out of 1 available) 12033 1726867223.35313: exiting _queue_task() for managed_node3/debug 12033 1726867223.35324: done queuing things up, now waiting for results queue to drain 12033 1726867223.35326: waiting for pending results... 
12033 1726867223.35793: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12033 1726867223.35798: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1e 12033 1726867223.35804: variable 'ansible_search_path' from source: unknown 12033 1726867223.35807: variable 'ansible_search_path' from source: unknown 12033 1726867223.35837: calling self._execute() 12033 1726867223.35946: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.35950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.35959: variable 'omit' from source: magic vars 12033 1726867223.36251: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.36255: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.36261: variable 'omit' from source: magic vars 12033 1726867223.36309: variable 'omit' from source: magic vars 12033 1726867223.36333: variable 'omit' from source: magic vars 12033 1726867223.36366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867223.36393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867223.36408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867223.36421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.36431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.36454: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867223.36462: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.36465: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 12033 1726867223.36530: Set connection var ansible_pipelining to False 12033 1726867223.36537: Set connection var ansible_shell_executable to /bin/sh 12033 1726867223.36544: Set connection var ansible_timeout to 10 12033 1726867223.36548: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867223.36551: Set connection var ansible_connection to ssh 12033 1726867223.36556: Set connection var ansible_shell_type to sh 12033 1726867223.36575: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.36579: variable 'ansible_connection' from source: unknown 12033 1726867223.36582: variable 'ansible_module_compression' from source: unknown 12033 1726867223.36584: variable 'ansible_shell_type' from source: unknown 12033 1726867223.36586: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.36589: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.36591: variable 'ansible_pipelining' from source: unknown 12033 1726867223.36598: variable 'ansible_timeout' from source: unknown 12033 1726867223.36600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.36695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867223.36709: variable 'omit' from source: magic vars 12033 1726867223.36713: starting attempt loop 12033 1726867223.36715: running the handler 12033 1726867223.36808: variable '__network_connections_result' from source: set_fact 12033 1726867223.36849: handler run complete 12033 1726867223.36862: attempt loop complete, returning result 12033 1726867223.36865: _execute() done 12033 1726867223.36867: dumping result to json 12033 1726867223.36870: 
done dumping result, returning 12033 1726867223.36879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-74bb-502b-000000000e1e] 12033 1726867223.36883: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1e 12033 1726867223.36987: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1e 12033 1726867223.36990: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 12033 1726867223.37085: no more pending results, returning what we have 12033 1726867223.37088: results queue empty 12033 1726867223.37089: checking for any_errors_fatal 12033 1726867223.37094: done checking for any_errors_fatal 12033 1726867223.37094: checking for max_fail_percentage 12033 1726867223.37097: done checking for max_fail_percentage 12033 1726867223.37097: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.37098: done checking to see if all hosts have failed 12033 1726867223.37099: getting the remaining hosts for this loop 12033 1726867223.37100: done getting the remaining hosts for this loop 12033 1726867223.37105: getting the next task for host managed_node3 12033 1726867223.37111: done getting next task for host managed_node3 12033 1726867223.37116: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867223.37120: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867223.37132: getting variables 12033 1726867223.37133: in VariableManager get_vars() 12033 1726867223.37165: Calling all_inventory to load vars for managed_node3 12033 1726867223.37167: Calling groups_inventory to load vars for managed_node3 12033 1726867223.37168: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.37174: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.37176: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.37180: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.38429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.39852: done with get_vars() 12033 1726867223.39873: done getting variables 12033 1726867223.39935: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:23 -0400 (0:00:00.049) 0:01:02.515 ****** 12033 1726867223.39974: entering _queue_task() for managed_node3/debug 12033 1726867223.40234: worker is 1 (out of 1 available) 12033 1726867223.40246: exiting _queue_task() for managed_node3/debug 12033 1726867223.40258: done queuing things up, now waiting for results queue to drain 12033 1726867223.40260: waiting for pending results... 12033 1726867223.40444: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12033 1726867223.40548: in run() - task 0affcac9-a3a5-74bb-502b-000000000e1f 12033 1726867223.40567: variable 'ansible_search_path' from source: unknown 12033 1726867223.40571: variable 'ansible_search_path' from source: unknown 12033 1726867223.40601: calling self._execute() 12033 1726867223.40670: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.40701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.40705: variable 'omit' from source: magic vars 12033 1726867223.40969: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.40981: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.40987: variable 'omit' from source: magic vars 12033 1726867223.41038: variable 'omit' from source: magic vars 12033 1726867223.41062: variable 'omit' from source: magic vars 12033 1726867223.41094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867223.41125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867223.41140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867223.41153: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.41163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.41188: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867223.41191: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.41194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.41267: Set connection var ansible_pipelining to False 12033 1726867223.41273: Set connection var ansible_shell_executable to /bin/sh 12033 1726867223.41282: Set connection var ansible_timeout to 10 12033 1726867223.41287: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867223.41289: Set connection var ansible_connection to ssh 12033 1726867223.41294: Set connection var ansible_shell_type to sh 12033 1726867223.41314: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.41317: variable 'ansible_connection' from source: unknown 12033 1726867223.41321: variable 'ansible_module_compression' from source: unknown 12033 1726867223.41324: variable 'ansible_shell_type' from source: unknown 12033 1726867223.41326: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.41328: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.41330: variable 'ansible_pipelining' from source: unknown 12033 1726867223.41332: variable 'ansible_timeout' from source: unknown 12033 1726867223.41335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.41433: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867223.41442: variable 'omit' from source: magic vars 12033 1726867223.41447: starting attempt loop 12033 1726867223.41458: running the handler 12033 1726867223.41499: variable '__network_connections_result' from source: set_fact 12033 1726867223.41554: variable '__network_connections_result' from source: set_fact 12033 1726867223.41643: handler run complete 12033 1726867223.41661: attempt loop complete, returning result 12033 1726867223.41665: _execute() done 12033 1726867223.41669: dumping result to json 12033 1726867223.41671: done dumping result, returning 12033 1726867223.41684: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-74bb-502b-000000000e1f] 12033 1726867223.41686: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1f 12033 1726867223.41775: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e1f 12033 1726867223.41781: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12033 1726867223.41865: no more pending results, returning what we have 12033 1726867223.41868: results queue empty 12033 1726867223.41869: checking for any_errors_fatal 12033 1726867223.41873: done checking for any_errors_fatal 12033 1726867223.41874: 
checking for max_fail_percentage 12033 1726867223.41875: done checking for max_fail_percentage 12033 1726867223.41876: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.41878: done checking to see if all hosts have failed 12033 1726867223.41879: getting the remaining hosts for this loop 12033 1726867223.41881: done getting the remaining hosts for this loop 12033 1726867223.41884: getting the next task for host managed_node3 12033 1726867223.41890: done getting next task for host managed_node3 12033 1726867223.41893: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867223.41898: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867223.41909: getting variables 12033 1726867223.41911: in VariableManager get_vars() 12033 1726867223.41950: Calling all_inventory to load vars for managed_node3 12033 1726867223.42001: Calling groups_inventory to load vars for managed_node3 12033 1726867223.42007: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.42015: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.42018: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.42022: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.43129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.44001: done with get_vars() 12033 1726867223.44018: done getting variables 12033 1726867223.44057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:23 -0400 (0:00:00.041) 0:01:02.556 ****** 12033 1726867223.44082: entering _queue_task() for managed_node3/debug 12033 1726867223.44269: worker is 1 (out of 1 available) 12033 1726867223.44284: exiting _queue_task() for managed_node3/debug 12033 1726867223.44296: done queuing things up, now waiting for results queue to drain 12033 1726867223.44297: waiting for pending results... 
12033 1726867223.44473: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12033 1726867223.44571: in run() - task 0affcac9-a3a5-74bb-502b-000000000e20 12033 1726867223.44585: variable 'ansible_search_path' from source: unknown 12033 1726867223.44589: variable 'ansible_search_path' from source: unknown 12033 1726867223.44615: calling self._execute() 12033 1726867223.44687: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.44691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.44701: variable 'omit' from source: magic vars 12033 1726867223.44962: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.44976: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.45059: variable 'network_state' from source: role '' defaults 12033 1726867223.45072: Evaluated conditional (network_state != {}): False 12033 1726867223.45075: when evaluation is False, skipping this task 12033 1726867223.45080: _execute() done 12033 1726867223.45083: dumping result to json 12033 1726867223.45085: done dumping result, returning 12033 1726867223.45091: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-74bb-502b-000000000e20] 12033 1726867223.45096: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e20 12033 1726867223.45185: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e20 12033 1726867223.45188: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 12033 1726867223.45236: no more pending results, returning what we have 12033 1726867223.45239: results queue empty 12033 1726867223.45240: checking for any_errors_fatal 12033 1726867223.45248: done checking for any_errors_fatal 12033 1726867223.45249: checking for 
max_fail_percentage 12033 1726867223.45251: done checking for max_fail_percentage 12033 1726867223.45252: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.45252: done checking to see if all hosts have failed 12033 1726867223.45253: getting the remaining hosts for this loop 12033 1726867223.45254: done getting the remaining hosts for this loop 12033 1726867223.45257: getting the next task for host managed_node3 12033 1726867223.45264: done getting next task for host managed_node3 12033 1726867223.45267: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867223.45272: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867223.45291: getting variables 12033 1726867223.45292: in VariableManager get_vars() 12033 1726867223.45326: Calling all_inventory to load vars for managed_node3 12033 1726867223.45328: Calling groups_inventory to load vars for managed_node3 12033 1726867223.45330: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.45338: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.45340: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.45346: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.46196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.47047: done with get_vars() 12033 1726867223.47062: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:23 -0400 (0:00:00.030) 0:01:02.587 ****** 12033 1726867223.47129: entering _queue_task() for managed_node3/ping 12033 1726867223.47316: worker is 1 (out of 1 available) 12033 1726867223.47328: exiting _queue_task() for managed_node3/ping 12033 1726867223.47339: done queuing things up, now waiting for results queue to drain 12033 1726867223.47340: waiting for pending results... 
12033 1726867223.47515: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 12033 1726867223.47618: in run() - task 0affcac9-a3a5-74bb-502b-000000000e21 12033 1726867223.47629: variable 'ansible_search_path' from source: unknown 12033 1726867223.47632: variable 'ansible_search_path' from source: unknown 12033 1726867223.47659: calling self._execute() 12033 1726867223.47729: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.47733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.47741: variable 'omit' from source: magic vars 12033 1726867223.48009: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.48019: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.48025: variable 'omit' from source: magic vars 12033 1726867223.48075: variable 'omit' from source: magic vars 12033 1726867223.48100: variable 'omit' from source: magic vars 12033 1726867223.48133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867223.48158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867223.48173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867223.48187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.48197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.48224: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867223.48228: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.48230: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 12033 1726867223.48294: Set connection var ansible_pipelining to False 12033 1726867223.48300: Set connection var ansible_shell_executable to /bin/sh 12033 1726867223.48308: Set connection var ansible_timeout to 10 12033 1726867223.48313: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867223.48315: Set connection var ansible_connection to ssh 12033 1726867223.48322: Set connection var ansible_shell_type to sh 12033 1726867223.48339: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.48342: variable 'ansible_connection' from source: unknown 12033 1726867223.48345: variable 'ansible_module_compression' from source: unknown 12033 1726867223.48347: variable 'ansible_shell_type' from source: unknown 12033 1726867223.48350: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.48352: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.48354: variable 'ansible_pipelining' from source: unknown 12033 1726867223.48358: variable 'ansible_timeout' from source: unknown 12033 1726867223.48362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.48506: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12033 1726867223.48512: variable 'omit' from source: magic vars 12033 1726867223.48517: starting attempt loop 12033 1726867223.48520: running the handler 12033 1726867223.48532: _low_level_execute_command(): starting 12033 1726867223.48538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867223.49018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 
1726867223.49053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.49056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.49059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.49061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.49112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867223.49115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867223.49119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.49180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.50879: stdout chunk (state=3): >>>/root <<< 12033 1726867223.50982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.51006: stderr chunk (state=3): >>><<< 12033 1726867223.51011: stdout chunk (state=3): >>><<< 12033 1726867223.51030: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867223.51041: _low_level_execute_command(): starting 12033 1726867223.51049: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715 `" && echo ansible-tmp-1726867223.5102973-14957-96120465668715="` echo /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715 `" ) && sleep 0' 12033 1726867223.51457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.51461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.51470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867223.51472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867223.51474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.51522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867223.51527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.51569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.53523: stdout chunk (state=3): >>>ansible-tmp-1726867223.5102973-14957-96120465668715=/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715 <<< 12033 1726867223.53637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.53662: stderr chunk (state=3): >>><<< 12033 1726867223.53665: stdout chunk (state=3): >>><<< 12033 1726867223.53680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867223.5102973-14957-96120465668715=/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867223.53715: variable 'ansible_module_compression' from source: unknown 12033 1726867223.53747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12033 1726867223.53775: variable 'ansible_facts' from source: unknown 12033 1726867223.53832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py 12033 1726867223.53922: Sending initial data 12033 1726867223.53925: Sent initial data (152 bytes) 12033 1726867223.54341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.54344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.54347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.54349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.54401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867223.54406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.54449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.56020: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867223.56056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867223.56099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdxv626gk /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py <<< 12033 1726867223.56111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py" <<< 12033 1726867223.56144: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpdxv626gk" to remote "/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py" <<< 12033 1726867223.56686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.56721: stderr chunk (state=3): >>><<< 12033 1726867223.56724: stdout chunk (state=3): >>><<< 12033 1726867223.56740: done transferring module to remote 12033 1726867223.56747: _low_level_execute_command(): starting 12033 1726867223.56751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/ /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py && sleep 0' 12033 1726867223.57168: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.57171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.57174: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867223.57176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867223.57183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.57229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867223.57232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.57281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.59024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.59047: stderr chunk (state=3): >>><<< 12033 1726867223.59050: stdout chunk (state=3): >>><<< 12033 1726867223.59061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867223.59064: _low_level_execute_command(): starting 12033 1726867223.59069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/AnsiballZ_ping.py && sleep 0' 12033 1726867223.59465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.59468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.59470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.59472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.59524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867223.59531: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.59579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.74803: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12033 1726867223.75847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867223.75869: stderr chunk (state=3): >>><<< 12033 1726867223.75879: stdout chunk (state=3): >>><<< 12033 1726867223.75907: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867223.75939: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867223.75953: _low_level_execute_command(): starting 12033 1726867223.75961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867223.5102973-14957-96120465668715/ > /dev/null 2>&1 && sleep 0' 12033 1726867223.76552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867223.76564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867223.76576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.76597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.76617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867223.76627: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867223.76640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.76657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867223.76690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 
1726867223.76707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867223.76720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867223.76732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.76748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.76760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867223.76843: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867223.76868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867223.76886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.76967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867223.78829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867223.79113: stdout chunk (state=3): >>><<< 12033 1726867223.79116: stderr chunk (state=3): >>><<< 12033 1726867223.79119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867223.79121: handler run complete 12033 1726867223.79123: attempt loop complete, returning result 12033 1726867223.79125: _execute() done 12033 1726867223.79127: dumping result to json 12033 1726867223.79129: done dumping result, returning 12033 1726867223.79131: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-74bb-502b-000000000e21] 12033 1726867223.79133: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e21 12033 1726867223.79483: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e21 12033 1726867223.79487: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 12033 1726867223.79558: no more pending results, returning what we have 12033 1726867223.79562: results queue empty 12033 1726867223.79563: checking for any_errors_fatal 12033 1726867223.79570: done checking for any_errors_fatal 12033 1726867223.79571: checking for max_fail_percentage 12033 1726867223.79573: done checking for max_fail_percentage 12033 1726867223.79574: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.79575: done checking to see if all hosts have failed 12033 1726867223.79575: getting the remaining hosts for this loop 12033 1726867223.79580: done getting the remaining hosts for this loop 12033 1726867223.79584: getting the next task for host managed_node3 
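The `ok: [managed_node3] => {"changed": false, "ping": "pong"}` result above is the whole contract of the `ping` module: it echoes its `data` argument (default `"pong"`) back, exactly as the module stdout `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` shows. A minimal local sketch of that success path (the real module additionally raises an error when `data == "crash"`; that branch is omitted here):

```python
# Hedged sketch: the ping module's observable contract from the log's
# stdout -- echo the "data" argument (default "pong") back unchanged.
import json

def ping(module_args):
    data = module_args.get("data", "pong")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping({"data": "pong"})))
```

This is why the role uses `ping` as its "Re-test connectivity" step: a round-tripped `"pong"` proves the full module pipeline (SSH, temp dir, Python interpreter) still works after a network change.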
12033 1726867223.79596: done getting next task for host managed_node3 12033 1726867223.79598: ^ task is: TASK: meta (role_complete) 12033 1726867223.79606: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867223.79620: getting variables 12033 1726867223.79621: in VariableManager get_vars() 12033 1726867223.80279: Calling all_inventory to load vars for managed_node3 12033 1726867223.80283: Calling groups_inventory to load vars for managed_node3 12033 1726867223.80287: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.80297: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.80300: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.80305: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.83127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.86021: done with get_vars() 12033 1726867223.86051: done getting variables 12033 1726867223.86132: done queuing things up, now waiting for results queue to drain 12033 1726867223.86134: results queue empty 12033 1726867223.86134: checking for any_errors_fatal 12033 1726867223.86137: done checking for any_errors_fatal 12033 1726867223.86138: checking for max_fail_percentage 12033 1726867223.86139: done checking for max_fail_percentage 12033 1726867223.86140: checking to see if all hosts have failed and the running result is not ok 12033 1726867223.86140: done checking to see if all hosts have failed 12033 1726867223.86141: getting the remaining hosts for this loop 12033 1726867223.86142: done getting the remaining hosts for this loop 12033 1726867223.86145: getting the next task for host managed_node3 12033 1726867223.86149: done getting next task for host managed_node3 12033 1726867223.86151: ^ task is: TASK: Delete the device '{{ controller_device }}' 12033 1726867223.86153: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867223.86156: getting variables 12033 1726867223.86156: in VariableManager get_vars() 12033 1726867223.86173: Calling all_inventory to load vars for managed_node3 12033 1726867223.86175: Calling groups_inventory to load vars for managed_node3 12033 1726867223.86176: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867223.86185: Calling all_plugins_play to load vars for managed_node3 12033 1726867223.86187: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867223.86189: Calling groups_plugins_play to load vars for managed_node3 12033 1726867223.88740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867223.92128: done with get_vars() 12033 1726867223.92154: done getting variables 12033 1726867223.92308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12033 1726867223.92427: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 17:20:23 -0400 (0:00:00.453) 0:01:03.040 ****** 12033 1726867223.92460: entering _queue_task() for managed_node3/command 12033 1726867223.93416: worker is 1 (out of 1 available) 12033 1726867223.93431: exiting _queue_task() for managed_node3/command 12033 1726867223.93445: done queuing things up, now waiting for results queue to drain 12033 1726867223.93446: waiting for pending results... 12033 1726867223.93945: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 12033 1726867223.94649: in run() - task 0affcac9-a3a5-74bb-502b-000000000e51 12033 1726867223.94664: variable 'ansible_search_path' from source: unknown 12033 1726867223.94667: variable 'ansible_search_path' from source: unknown 12033 1726867223.94704: calling self._execute() 12033 1726867223.95015: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.95023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.95033: variable 'omit' from source: magic vars 12033 1726867223.95981: variable 'ansible_distribution_major_version' from source: facts 12033 1726867223.96182: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867223.96186: variable 'omit' from source: magic vars 12033 1726867223.96189: variable 'omit' from source: magic vars 12033 1726867223.96192: variable 'controller_device' from source: play vars 12033 1726867223.96195: variable 'omit' from source: magic vars 12033 1726867223.96197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867223.96583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867223.96586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 
1726867223.96589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.96591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867223.96593: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867223.96594: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.96596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.96598: Set connection var ansible_pipelining to False 12033 1726867223.96599: Set connection var ansible_shell_executable to /bin/sh 12033 1726867223.96791: Set connection var ansible_timeout to 10 12033 1726867223.96801: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867223.96808: Set connection var ansible_connection to ssh 12033 1726867223.96816: Set connection var ansible_shell_type to sh 12033 1726867223.96839: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.96846: variable 'ansible_connection' from source: unknown 12033 1726867223.96853: variable 'ansible_module_compression' from source: unknown 12033 1726867223.96859: variable 'ansible_shell_type' from source: unknown 12033 1726867223.96865: variable 'ansible_shell_executable' from source: unknown 12033 1726867223.96870: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867223.96876: variable 'ansible_pipelining' from source: unknown 12033 1726867223.96885: variable 'ansible_timeout' from source: unknown 12033 1726867223.96892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867223.97235: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867223.97253: variable 'omit' from source: magic vars 12033 1726867223.97263: starting attempt loop 12033 1726867223.97269: running the handler 12033 1726867223.97291: _low_level_execute_command(): starting 12033 1726867223.97304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867223.98690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867223.98732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.98747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867223.98757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867223.98962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867223.98975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867223.99047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
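Each remote action starts the way the trace shows here: `_low_level_execute_command()` runs `/bin/sh -c 'echo ~ && sleep 0'` so Ansible can learn the remote home directory (the `stdout=/root` that follows) before expanding `~/.ansible/tmp` for the module upload. A local stand-in for that probe, minus the SSH transport:

```python
# Hedged sketch: Ansible's first low-level command resolves the remote
# home directory via "echo ~"; here it runs locally instead of over SSH.
import subprocess

proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True,
)
home = proc.stdout.strip()  # "/root" in the log above
print(proc.returncode, home)
```

The trailing `&& sleep 0` appears on every low-level command in this log; it forces the shell to report the pipeline's status cleanly rather than exec-replacing itself with the last command.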
12033 1726867224.00722: stdout chunk (state=3): >>>/root <<< 12033 1726867224.01015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.01020: stdout chunk (state=3): >>><<< 12033 1726867224.01022: stderr chunk (state=3): >>><<< 12033 1726867224.01126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.01129: _low_level_execute_command(): starting 12033 1726867224.01132: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995 `" && echo ansible-tmp-1726867224.0104346-14977-63774633716995="` echo /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995 `" ) && sleep 0' 12033 1726867224.02101: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.02330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.02355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.02380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.02461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.04351: stdout chunk (state=3): >>>ansible-tmp-1726867224.0104346-14977-63774633716995=/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995 <<< 12033 1726867224.04493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.04504: stdout chunk (state=3): >>><<< 12033 1726867224.04523: stderr chunk (state=3): >>><<< 12033 1726867224.04541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867224.0104346-14977-63774633716995=/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.04579: variable 'ansible_module_compression' from source: unknown 12033 1726867224.04703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867224.04906: variable 'ansible_facts' from source: unknown 12033 1726867224.05001: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py 12033 1726867224.05356: Sending initial data 12033 1726867224.05366: Sent initial data (155 bytes) 12033 1726867224.06555: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867224.06567: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.06615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.06626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.07295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.08411: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12033 1726867224.08433: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867224.08503: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867224.08544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmppmnzdgg7 /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py <<< 12033 1726867224.08554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py" <<< 12033 1726867224.08619: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmppmnzdgg7" to remote "/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py" <<< 12033 1726867224.09822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.09926: stderr chunk (state=3): >>><<< 12033 1726867224.09929: stdout chunk (state=3): >>><<< 12033 1726867224.09931: done transferring module to remote 12033 1726867224.09933: _low_level_execute_command(): starting 12033 1726867224.09936: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/ /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py && sleep 0' 12033 1726867224.10990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.11004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.11094: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.11293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.11363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.13184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.13187: stdout chunk (state=3): >>><<< 12033 1726867224.13193: stderr chunk (state=3): >>><<< 12033 1726867224.13213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.13227: _low_level_execute_command(): starting 12033 1726867224.13237: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/AnsiballZ_command.py && sleep 0' 12033 1726867224.14043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.14059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.14101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.14125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867224.14199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.14234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.14257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 
1726867224.14272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.14397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.30295: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:20:24.293337", "end": "2024-09-20 17:20:24.300516", "delta": "0:00:00.007179", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867224.31762: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. <<< 12033 1726867224.31766: stdout chunk (state=3): >>><<< 12033 1726867224.31769: stderr chunk (state=3): >>><<< 12033 1726867224.31913: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 17:20:24.293337", "end": "2024-09-20 17:20:24.300516", "delta": "0:00:00.007179", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.68 closed. 12033 1726867224.31918: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867224.31920: _low_level_execute_command(): starting 12033 1726867224.31922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867224.0104346-14977-63774633716995/ > /dev/null 2>&1 && sleep 0' 12033 1726867224.32433: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.32446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.32465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.32490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867224.32586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.32610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.32691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.34511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.34560: stderr chunk (state=3): >>><<< 12033 1726867224.34570: stdout chunk (state=3): >>><<< 12033 1726867224.34594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.34612: handler run complete 12033 1726867224.34640: Evaluated conditional (False): False 12033 1726867224.34782: Evaluated conditional (False): False 12033 1726867224.34785: attempt loop complete, returning result 12033 1726867224.34788: _execute() done 12033 1726867224.34790: dumping result to json 12033 1726867224.34792: done dumping result, returning 12033 1726867224.34794: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0affcac9-a3a5-74bb-502b-000000000e51] 12033 1726867224.34797: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e51 12033 1726867224.34874: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e51 12033 1726867224.34880: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007179", "end": "2024-09-20 17:20:24.300516", "failed_when_result": false, "rc": 1, "start": "2024-09-20 17:20:24.293337" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 12033 1726867224.34950: no more 
pending results, returning what we have 12033 1726867224.34954: results queue empty 12033 1726867224.34956: checking for any_errors_fatal 12033 1726867224.34958: done checking for any_errors_fatal 12033 1726867224.34958: checking for max_fail_percentage 12033 1726867224.34961: done checking for max_fail_percentage 12033 1726867224.34962: checking to see if all hosts have failed and the running result is not ok 12033 1726867224.34963: done checking to see if all hosts have failed 12033 1726867224.34964: getting the remaining hosts for this loop 12033 1726867224.34966: done getting the remaining hosts for this loop 12033 1726867224.34970: getting the next task for host managed_node3 12033 1726867224.34986: done getting next task for host managed_node3 12033 1726867224.34989: ^ task is: TASK: Remove test interfaces 12033 1726867224.34993: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867224.34997: getting variables 12033 1726867224.34999: in VariableManager get_vars() 12033 1726867224.35047: Calling all_inventory to load vars for managed_node3 12033 1726867224.35050: Calling groups_inventory to load vars for managed_node3 12033 1726867224.35052: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867224.35063: Calling all_plugins_play to load vars for managed_node3 12033 1726867224.35065: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867224.35067: Calling groups_plugins_play to load vars for managed_node3 12033 1726867224.36675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867224.38567: done with get_vars() 12033 1726867224.38962: done getting variables 12033 1726867224.39025: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:20:24 -0400 (0:00:00.465) 0:01:03.506 ****** 12033 1726867224.39056: entering _queue_task() for managed_node3/shell 12033 1726867224.39373: worker is 1 (out of 1 available) 12033 1726867224.39388: exiting _queue_task() for managed_node3/shell 12033 1726867224.39399: done queuing things up, now waiting for results queue to drain 12033 1726867224.39401: waiting for pending results... 
12033 1726867224.39793: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 12033 1726867224.39801: in run() - task 0affcac9-a3a5-74bb-502b-000000000e57 12033 1726867224.39823: variable 'ansible_search_path' from source: unknown 12033 1726867224.39831: variable 'ansible_search_path' from source: unknown 12033 1726867224.39917: calling self._execute() 12033 1726867224.39970: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867224.39984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867224.39998: variable 'omit' from source: magic vars 12033 1726867224.40375: variable 'ansible_distribution_major_version' from source: facts 12033 1726867224.40395: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867224.40407: variable 'omit' from source: magic vars 12033 1726867224.40460: variable 'omit' from source: magic vars 12033 1726867224.40680: variable 'dhcp_interface1' from source: play vars 12033 1726867224.40684: variable 'dhcp_interface2' from source: play vars 12033 1726867224.40687: variable 'omit' from source: magic vars 12033 1726867224.40689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867224.40732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867224.40757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867224.40784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867224.40802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867224.40840: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867224.40851: variable 'ansible_host' from source: host 
vars for 'managed_node3' 12033 1726867224.40858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867224.40966: Set connection var ansible_pipelining to False 12033 1726867224.40984: Set connection var ansible_shell_executable to /bin/sh 12033 1726867224.41001: Set connection var ansible_timeout to 10 12033 1726867224.41012: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867224.41019: Set connection var ansible_connection to ssh 12033 1726867224.41182: Set connection var ansible_shell_type to sh 12033 1726867224.41185: variable 'ansible_shell_executable' from source: unknown 12033 1726867224.41188: variable 'ansible_connection' from source: unknown 12033 1726867224.41190: variable 'ansible_module_compression' from source: unknown 12033 1726867224.41192: variable 'ansible_shell_type' from source: unknown 12033 1726867224.41194: variable 'ansible_shell_executable' from source: unknown 12033 1726867224.41196: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867224.41198: variable 'ansible_pipelining' from source: unknown 12033 1726867224.41199: variable 'ansible_timeout' from source: unknown 12033 1726867224.41202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867224.41289: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867224.41309: variable 'omit' from source: magic vars 12033 1726867224.41325: starting attempt loop 12033 1726867224.41332: running the handler 12033 1726867224.41347: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867224.41371: _low_level_execute_command(): starting 12033 1726867224.41387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867224.42470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.42544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.42598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.42614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.42646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.42725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.44440: stdout chunk (state=3): >>>/root <<< 12033 1726867224.44605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.44609: stdout chunk (state=3): 
>>><<< 12033 1726867224.44612: stderr chunk (state=3): >>><<< 12033 1726867224.44641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.44745: _low_level_execute_command(): starting 12033 1726867224.44749: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976 `" && echo ansible-tmp-1726867224.4464734-14997-15938156130976="` echo /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976 `" ) && sleep 0' 12033 1726867224.45331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.45344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.45368: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.45428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867224.45454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.45528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.45581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.45607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.45667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.45701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.47628: stdout chunk (state=3): >>>ansible-tmp-1726867224.4464734-14997-15938156130976=/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976 <<< 12033 1726867224.47783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.47786: stdout chunk (state=3): >>><<< 12033 1726867224.47789: stderr chunk (state=3): >>><<< 12033 1726867224.47814: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867224.4464734-14997-15938156130976=/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.47983: variable 'ansible_module_compression' from source: unknown 12033 1726867224.47986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867224.47989: variable 'ansible_facts' from source: unknown 12033 1726867224.48051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py 12033 1726867224.48231: Sending initial data 12033 1726867224.48241: Sent initial data (155 bytes) 12033 1726867224.48833: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.48849: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.48873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.48993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.49008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.49025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.49047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.49136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.50705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867224.50763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12033 1726867224.50834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp7oj_j4_c /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py <<< 12033 1726867224.50857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py" <<< 12033 1726867224.50895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp7oj_j4_c" to remote "/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py" <<< 12033 1726867224.51635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.51668: stderr chunk (state=3): >>><<< 12033 1726867224.51683: stdout chunk (state=3): >>><<< 12033 1726867224.51732: done transferring module to remote 12033 1726867224.51735: _low_level_execute_command(): starting 12033 1726867224.51741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/ /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py && sleep 0' 12033 1726867224.52497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.52615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.52629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.52719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.54527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.54530: stdout chunk (state=3): >>><<< 12033 1726867224.54537: stderr chunk (state=3): >>><<< 12033 1726867224.54554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.54557: _low_level_execute_command(): starting 12033 1726867224.54562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/AnsiballZ_command.py && sleep 0' 12033 1726867224.55164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867224.55221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.55229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.55231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867224.55234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867224.55236: stderr chunk (state=3): >>>debug2: match not found <<< 12033 1726867224.55238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.55251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12033 1726867224.55258: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address <<< 12033 1726867224.55265: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12033 1726867224.55274: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12033 1726867224.55286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.55338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867224.55341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867224.55344: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.55390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.55405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867224.55419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.55497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.74491: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:20:24.705840", "end": "2024-09-20 17:20:24.742232", "delta": "0:00:00.036392", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error 
\"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867224.76457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867224.76461: stdout chunk (state=3): >>><<< 12033 1726867224.76463: stderr chunk (state=3): >>><<< 12033 1726867224.76466: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:20:24.705840", "end": "2024-09-20 17:20:24.742232", "delta": "0:00:00.036392", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error 
\"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867224.76472: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867224.76474: _low_level_execute_command(): starting 12033 1726867224.76479: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867224.4464734-14997-15938156130976/ > /dev/null 2>&1 && sleep 0' 12033 1726867224.77397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.77447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.77699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.77757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.79571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.79593: stderr chunk (state=3): >>><<< 12033 1726867224.79785: stdout chunk (state=3): >>><<< 12033 1726867224.79789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.79791: handler run complete 12033 1726867224.79794: Evaluated conditional (False): False 12033 1726867224.79796: attempt loop complete, returning result 12033 1726867224.79798: _execute() done 12033 1726867224.79800: dumping result to json 12033 1726867224.79805: done dumping result, returning 12033 1726867224.79807: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0affcac9-a3a5-74bb-502b-000000000e57] 12033 1726867224.79809: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e57 ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.036392", "end": "2024-09-20 17:20:24.742232", "rc": 0, "start": "2024-09-20 17:20:24.705840" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12033 1726867224.80158: no more pending results, returning what we have 12033 1726867224.80162: results queue empty 12033 1726867224.80163: checking for any_errors_fatal 12033 1726867224.80381: done checking for any_errors_fatal 12033 1726867224.80383: checking for max_fail_percentage 12033 1726867224.80385: done checking for max_fail_percentage 12033 1726867224.80387: checking to see if all hosts have failed and the running result is not ok 12033 1726867224.80388: 
done checking to see if all hosts have failed 12033 1726867224.80388: getting the remaining hosts for this loop 12033 1726867224.80390: done getting the remaining hosts for this loop 12033 1726867224.80394: getting the next task for host managed_node3 12033 1726867224.80405: done getting next task for host managed_node3 12033 1726867224.80409: ^ task is: TASK: Stop dnsmasq/radvd services 12033 1726867224.80413: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867224.80417: getting variables 12033 1726867224.80418: in VariableManager get_vars() 12033 1726867224.80470: Calling all_inventory to load vars for managed_node3 12033 1726867224.80474: Calling groups_inventory to load vars for managed_node3 12033 1726867224.80476: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867224.80598: Calling all_plugins_play to load vars for managed_node3 12033 1726867224.80605: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867224.80609: Calling groups_plugins_play to load vars for managed_node3 12033 1726867224.81291: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e57 12033 1726867224.81294: WORKER PROCESS EXITING 12033 1726867224.82713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867224.84961: done with get_vars() 12033 1726867224.84987: done getting variables 12033 1726867224.85165: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 17:20:24 -0400 (0:00:00.461) 0:01:03.967 ****** 12033 1726867224.85204: entering _queue_task() for managed_node3/shell 12033 1726867224.85890: worker is 1 (out of 1 available) 12033 1726867224.85908: exiting _queue_task() for managed_node3/shell 12033 1726867224.85921: done queuing things up, now waiting for results queue to drain 12033 1726867224.86038: waiting for pending results... 
12033 1726867224.86412: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 12033 1726867224.86698: in run() - task 0affcac9-a3a5-74bb-502b-000000000e58 12033 1726867224.86721: variable 'ansible_search_path' from source: unknown 12033 1726867224.86729: variable 'ansible_search_path' from source: unknown 12033 1726867224.87182: calling self._execute() 12033 1726867224.87187: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867224.87189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867224.87192: variable 'omit' from source: magic vars 12033 1726867224.87686: variable 'ansible_distribution_major_version' from source: facts 12033 1726867224.87982: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867224.87986: variable 'omit' from source: magic vars 12033 1726867224.87988: variable 'omit' from source: magic vars 12033 1726867224.87993: variable 'omit' from source: magic vars 12033 1726867224.88038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867224.88080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867224.88266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867224.88291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867224.88306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867224.88343: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867224.88351: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867224.88358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 
1726867224.88460: Set connection var ansible_pipelining to False 12033 1726867224.88605: Set connection var ansible_shell_executable to /bin/sh 12033 1726867224.88619: Set connection var ansible_timeout to 10 12033 1726867224.88629: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867224.88635: Set connection var ansible_connection to ssh 12033 1726867224.88641: Set connection var ansible_shell_type to sh 12033 1726867224.88667: variable 'ansible_shell_executable' from source: unknown 12033 1726867224.88713: variable 'ansible_connection' from source: unknown 12033 1726867224.88723: variable 'ansible_module_compression' from source: unknown 12033 1726867224.88730: variable 'ansible_shell_type' from source: unknown 12033 1726867224.88738: variable 'ansible_shell_executable' from source: unknown 12033 1726867224.88745: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867224.88754: variable 'ansible_pipelining' from source: unknown 12033 1726867224.88761: variable 'ansible_timeout' from source: unknown 12033 1726867224.88768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867224.88900: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867224.88919: variable 'omit' from source: magic vars 12033 1726867224.88930: starting attempt loop 12033 1726867224.88936: running the handler 12033 1726867224.88949: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867224.88974: 
_low_level_execute_command(): starting 12033 1726867224.88990: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867224.90260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.90264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.90266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.90268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.90333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867224.90364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.90407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.92321: stdout chunk (state=3): >>>/root <<< 12033 1726867224.92370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.92384: stdout chunk (state=3): >>><<< 12033 1726867224.92398: stderr chunk (state=3): >>><<< 12033 1726867224.92429: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.92453: _low_level_execute_command(): starting 12033 1726867224.92463: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795 `" && echo ansible-tmp-1726867224.9243662-15021-70374994312795="` echo /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795 `" ) && sleep 0' 12033 1726867224.93457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867224.93471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 
originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12033 1726867224.93509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.93570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.93611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.95598: stdout chunk (state=3): >>>ansible-tmp-1726867224.9243662-15021-70374994312795=/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795 <<< 12033 1726867224.95670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867224.95711: stderr chunk (state=3): >>><<< 12033 1726867224.95714: stdout chunk (state=3): >>><<< 12033 1726867224.95732: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867224.9243662-15021-70374994312795=/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867224.95982: variable 'ansible_module_compression' from source: unknown 12033 1726867224.95985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867224.95988: variable 'ansible_facts' from source: unknown 12033 1726867224.96147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py 12033 1726867224.96549: Sending initial data 12033 1726867224.96558: Sent initial data (155 bytes) 12033 1726867224.97727: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 12033 1726867224.97745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867224.97762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867224.97774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867224.97791: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867224.97804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867224.97899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867224.97934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867224.99510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867224.99605: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867224.99665: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpvhep1g1y /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py <<< 12033 1726867224.99675: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py" <<< 12033 1726867224.99712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpvhep1g1y" to remote "/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py" <<< 12033 1726867225.01486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.01489: stdout chunk (state=3): >>><<< 12033 1726867225.01491: stderr chunk (state=3): >>><<< 12033 1726867225.01494: done transferring module to remote 12033 1726867225.01496: _low_level_execute_command(): starting 12033 1726867225.01498: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/ /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py && sleep 0' 12033 1726867225.02465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.02482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.02497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.02514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.02593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.02726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.02752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.02774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.02854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.04671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.04691: stdout chunk (state=3): >>><<< 12033 1726867225.04718: stderr chunk (state=3): >>><<< 12033 1726867225.04738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.04747: _low_level_execute_command(): starting 12033 1726867225.04757: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/AnsiballZ_command.py && sleep 0' 12033 1726867225.05431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.05470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.05557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.05560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867225.05667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.05699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.05821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.23631: stdout chunk (state=3): >>> <<< 12033 1726867225.23646: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:25.206448", "end": "2024-09-20 17:20:25.232049", "delta": "0:00:00.025601", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 
12033 1726867225.25024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867225.25057: stderr chunk (state=3): >>><<< 12033 1726867225.25060: stdout chunk (state=3): >>><<< 12033 1726867225.25080: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:20:25.206448", "end": "2024-09-20 17:20:25.232049", "delta": "0:00:00.025601", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867225.25117: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867225.25125: _low_level_execute_command(): starting 12033 1726867225.25130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867224.9243662-15021-70374994312795/ > /dev/null 2>&1 && sleep 0' 12033 1726867225.25590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.25593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.25595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.25598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.25654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.25658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.25664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.25709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.27514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.27543: stderr chunk (state=3): >>><<< 12033 1726867225.27546: stdout chunk (state=3): >>><<< 12033 1726867225.27562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.27568: handler run complete 12033 1726867225.27589: Evaluated conditional (False): False 12033 1726867225.27597: attempt loop complete, returning result 12033 1726867225.27599: _execute() done 12033 1726867225.27604: dumping result to json 12033 1726867225.27607: done dumping result, returning 12033 1726867225.27614: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0affcac9-a3a5-74bb-502b-000000000e58] 12033 1726867225.27619: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e58 12033 1726867225.27721: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e58 12033 1726867225.27724: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.025601", "end": "2024-09-20 17:20:25.232049", "rc": 0, "start": "2024-09-20 17:20:25.206448" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf 
/run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12033 1726867225.27819: no more pending results, returning what we have 12033 1726867225.27823: results queue empty 12033 1726867225.27824: checking for any_errors_fatal 12033 1726867225.27832: done checking for any_errors_fatal 12033 1726867225.27832: checking for max_fail_percentage 12033 1726867225.27834: done checking for max_fail_percentage 12033 1726867225.27835: checking to see if all hosts have failed and the running result is not ok 12033 1726867225.27836: done checking to see if all hosts have failed 12033 1726867225.27836: getting the remaining hosts for this loop 12033 1726867225.27838: done getting the remaining hosts for this loop 12033 1726867225.27841: getting the next task for host managed_node3 12033 1726867225.27853: done getting next task for host managed_node3 12033 1726867225.27855: ^ task is: TASK: Check routes and DNS 12033 1726867225.27860: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867225.27863: getting variables 12033 1726867225.27865: in VariableManager get_vars() 12033 1726867225.27912: Calling all_inventory to load vars for managed_node3 12033 1726867225.27920: Calling groups_inventory to load vars for managed_node3 12033 1726867225.27922: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867225.27931: Calling all_plugins_play to load vars for managed_node3 12033 1726867225.27934: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867225.27936: Calling groups_plugins_play to load vars for managed_node3 12033 1726867225.28883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867225.29745: done with get_vars() 12033 1726867225.29762: done getting variables 12033 1726867225.29809: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:20:25 -0400 (0:00:00.446) 0:01:04.414 ****** 12033 1726867225.29834: entering _queue_task() for managed_node3/shell 12033 1726867225.30094: worker is 1 (out of 1 available) 12033 1726867225.30109: exiting _queue_task() for managed_node3/shell 12033 1726867225.30122: done queuing things up, now waiting for results queue to drain 12033 1726867225.30124: waiting for pending results... 
12033 1726867225.30307: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 12033 1726867225.30384: in run() - task 0affcac9-a3a5-74bb-502b-000000000e5c 12033 1726867225.30397: variable 'ansible_search_path' from source: unknown 12033 1726867225.30400: variable 'ansible_search_path' from source: unknown 12033 1726867225.30429: calling self._execute() 12033 1726867225.30507: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.30511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.30519: variable 'omit' from source: magic vars 12033 1726867225.30790: variable 'ansible_distribution_major_version' from source: facts 12033 1726867225.30807: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867225.30810: variable 'omit' from source: magic vars 12033 1726867225.30838: variable 'omit' from source: magic vars 12033 1726867225.30863: variable 'omit' from source: magic vars 12033 1726867225.30897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867225.30928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867225.30944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867225.30957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867225.30967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867225.30993: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867225.30997: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.31000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.31068: 
Set connection var ansible_pipelining to False 12033 1726867225.31074: Set connection var ansible_shell_executable to /bin/sh 12033 1726867225.31082: Set connection var ansible_timeout to 10 12033 1726867225.31087: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867225.31090: Set connection var ansible_connection to ssh 12033 1726867225.31094: Set connection var ansible_shell_type to sh 12033 1726867225.31116: variable 'ansible_shell_executable' from source: unknown 12033 1726867225.31118: variable 'ansible_connection' from source: unknown 12033 1726867225.31123: variable 'ansible_module_compression' from source: unknown 12033 1726867225.31125: variable 'ansible_shell_type' from source: unknown 12033 1726867225.31127: variable 'ansible_shell_executable' from source: unknown 12033 1726867225.31129: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.31131: variable 'ansible_pipelining' from source: unknown 12033 1726867225.31134: variable 'ansible_timeout' from source: unknown 12033 1726867225.31136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.31235: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867225.31247: variable 'omit' from source: magic vars 12033 1726867225.31251: starting attempt loop 12033 1726867225.31253: running the handler 12033 1726867225.31264: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867225.31281: 
_low_level_execute_command(): starting 12033 1726867225.31288: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867225.31812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.31815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.31819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.31821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.31876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.31887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.31890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.31930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.33524: stdout chunk (state=3): >>>/root <<< 12033 1726867225.33625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.33656: stderr chunk (state=3): >>><<< 12033 1726867225.33659: stdout chunk (state=3): >>><<< 12033 1726867225.33676: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.33689: _low_level_execute_command(): starting 12033 1726867225.33693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957 `" && echo ansible-tmp-1726867225.3367548-15045-53355635996957="` echo /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957 `" ) && sleep 0' 12033 1726867225.34115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.34126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.34129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.34131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.34169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.34173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.34256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.36107: stdout chunk (state=3): >>>ansible-tmp-1726867225.3367548-15045-53355635996957=/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957 <<< 12033 1726867225.36221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.36238: stderr chunk (state=3): >>><<< 12033 1726867225.36242: stdout chunk (state=3): >>><<< 12033 1726867225.36254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867225.3367548-15045-53355635996957=/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.36280: variable 'ansible_module_compression' from source: unknown 12033 1726867225.36321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867225.36350: variable 'ansible_facts' from source: unknown 12033 1726867225.36407: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py 12033 1726867225.36501: Sending initial data 12033 1726867225.36508: Sent initial data (155 bytes) 12033 1726867225.37064: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.37080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.37100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.37190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.38728: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867225.38783: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867225.38831: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmp8umnsua8 /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py <<< 12033 1726867225.38834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py" <<< 12033 1726867225.38892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmp8umnsua8" to remote "/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py" <<< 12033 1726867225.39617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.39770: stderr chunk (state=3): >>><<< 12033 1726867225.39774: stdout chunk (state=3): >>><<< 12033 1726867225.39776: done transferring module to remote 12033 1726867225.39780: _low_level_execute_command(): starting 12033 1726867225.39783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/ /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py && sleep 0' 12033 1726867225.40293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.40315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.40330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.40346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.40361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 
1726867225.40392: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.40412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867225.40493: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.40586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.40704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.40749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.42625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.42628: stdout chunk (state=3): >>><<< 12033 1726867225.42630: stderr chunk (state=3): >>><<< 12033 1726867225.42663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.42760: _low_level_execute_command(): starting 12033 1726867225.42763: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/AnsiballZ_command.py && sleep 0' 12033 1726867225.43836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.43839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.43841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.43844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.43882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.43898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.44005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.44094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.44139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.60234: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3159sec preferred_lft 3159sec\n inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:20:25.591264", "end": "2024-09-20 17:20:25.599824", "delta": "0:00:00.008560", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 12033 1726867225.60265: stdout chunk (state=3): >>> <<< 12033 1726867225.61706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. <<< 12033 1726867225.61710: stderr chunk (state=3): >>><<< 12033 1726867225.61712: stdout chunk (state=3): >>><<< 12033 1726867225.61736: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3159sec preferred_lft 3159sec\n inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 
route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:20:25.591264", "end": "2024-09-20 17:20:25.599824", "delta": "0:00:00.008560", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
12033 1726867225.61869: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867225.61873: _low_level_execute_command(): starting 12033 1726867225.61875: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867225.3367548-15045-53355635996957/ > /dev/null 2>&1 && sleep 0' 12033 1726867225.62443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867225.62450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.62547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.62563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.62631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.64783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.64787: stdout chunk (state=3): >>><<< 12033 1726867225.64790: stderr chunk (state=3): >>><<< 12033 1726867225.64792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.64794: handler run complete 12033 1726867225.64797: Evaluated conditional (False): False 12033 1726867225.64799: attempt loop complete, returning result 12033 1726867225.64800: _execute() done 12033 1726867225.64805: dumping result to json 12033 1726867225.64807: done dumping result, returning 12033 1726867225.64810: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affcac9-a3a5-74bb-502b-000000000e5c] 12033 1726867225.64811: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e5c 12033 1726867225.64896: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e5c 12033 1726867225.64900: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008560", "end": "2024-09-20 17:20:25.599824", "rc": 0, "start": "2024-09-20 17:20:25.591264" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:de:45:ad:8b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.68/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3159sec preferred_lft 3159sec inet6 fe80::8ff:deff:fe45:ad8b/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.68 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.68 metric 100 IP 
-6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 12033 1726867225.64996: no more pending results, returning what we have 12033 1726867225.65000: results queue empty 12033 1726867225.65001: checking for any_errors_fatal 12033 1726867225.65011: done checking for any_errors_fatal 12033 1726867225.65012: checking for max_fail_percentage 12033 1726867225.65015: done checking for max_fail_percentage 12033 1726867225.65016: checking to see if all hosts have failed and the running result is not ok 12033 1726867225.65017: done checking to see if all hosts have failed 12033 1726867225.65018: getting the remaining hosts for this loop 12033 1726867225.65021: done getting the remaining hosts for this loop 12033 1726867225.65025: getting the next task for host managed_node3 12033 1726867225.65035: done getting next task for host managed_node3 12033 1726867225.65039: ^ task is: TASK: Verify DNS and network connectivity 12033 1726867225.65043: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867225.65048: getting variables 12033 1726867225.65053: in VariableManager get_vars() 12033 1726867225.65328: Calling all_inventory to load vars for managed_node3 12033 1726867225.65332: Calling groups_inventory to load vars for managed_node3 12033 1726867225.65335: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867225.65345: Calling all_plugins_play to load vars for managed_node3 12033 1726867225.65349: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867225.65351: Calling groups_plugins_play to load vars for managed_node3 12033 1726867225.66999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867225.69853: done with get_vars() 12033 1726867225.69879: done getting variables 12033 1726867225.69940: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:20:25 -0400 (0:00:00.402) 0:01:04.816 ****** 12033 1726867225.70044: entering _queue_task() for managed_node3/shell 12033 1726867225.70634: worker is 1 (out of 1 available) 12033 1726867225.70647: exiting _queue_task() for managed_node3/shell 12033 1726867225.70658: done queuing things up, now waiting for results queue to drain 12033 1726867225.70660: waiting for pending results... 
12033 1726867225.71097: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 12033 1726867225.71418: in run() - task 0affcac9-a3a5-74bb-502b-000000000e5d 12033 1726867225.71495: variable 'ansible_search_path' from source: unknown 12033 1726867225.71498: variable 'ansible_search_path' from source: unknown 12033 1726867225.71528: calling self._execute() 12033 1726867225.71651: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.71696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.71699: variable 'omit' from source: magic vars 12033 1726867225.72068: variable 'ansible_distribution_major_version' from source: facts 12033 1726867225.72091: Evaluated conditional (ansible_distribution_major_version != '6'): True 12033 1726867225.72247: variable 'ansible_facts' from source: unknown 12033 1726867225.73403: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 12033 1726867225.73406: variable 'omit' from source: magic vars 12033 1726867225.73466: variable 'omit' from source: magic vars 12033 1726867225.73658: variable 'omit' from source: magic vars 12033 1726867225.73662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12033 1726867225.73681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12033 1726867225.73707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12033 1726867225.73732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867225.73747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12033 1726867225.73807: variable 'inventory_hostname' from source: host vars for 'managed_node3' 12033 1726867225.73896: variable 
'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.73907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.74171: Set connection var ansible_pipelining to False 12033 1726867225.74226: Set connection var ansible_shell_executable to /bin/sh 12033 1726867225.74239: Set connection var ansible_timeout to 10 12033 1726867225.74248: Set connection var ansible_module_compression to ZIP_DEFLATED 12033 1726867225.74255: Set connection var ansible_connection to ssh 12033 1726867225.74264: Set connection var ansible_shell_type to sh 12033 1726867225.74299: variable 'ansible_shell_executable' from source: unknown 12033 1726867225.74307: variable 'ansible_connection' from source: unknown 12033 1726867225.74313: variable 'ansible_module_compression' from source: unknown 12033 1726867225.74323: variable 'ansible_shell_type' from source: unknown 12033 1726867225.74330: variable 'ansible_shell_executable' from source: unknown 12033 1726867225.74335: variable 'ansible_host' from source: host vars for 'managed_node3' 12033 1726867225.74342: variable 'ansible_pipelining' from source: unknown 12033 1726867225.74348: variable 'ansible_timeout' from source: unknown 12033 1726867225.74356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 12033 1726867225.74530: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867225.74553: variable 'omit' from source: magic vars 12033 1726867225.74564: starting attempt loop 12033 1726867225.74571: running the handler 12033 1726867225.74613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12033 1726867225.74619: _low_level_execute_command(): starting 12033 1726867225.74632: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12033 1726867225.75384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.75431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 <<< 12033 1726867225.75483: stderr chunk (state=3): >>>debug2: match found <<< 12033 1726867225.75489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.75619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.75657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.75724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.77427: stdout chunk (state=3): >>>/root <<< 12033 1726867225.77585: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 12033 1726867225.77724: stderr chunk (state=3): >>><<< 12033 1726867225.77727: stdout chunk (state=3): >>><<< 12033 1726867225.77733: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.77736: _low_level_execute_command(): starting 12033 1726867225.77740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224 `" && echo ansible-tmp-1726867225.776319-15065-23237075871224="` echo /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224 `" ) && sleep 0' 12033 1726867225.78860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.78992: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.79097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.79127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.79131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.79287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.81091: stdout chunk (state=3): >>>ansible-tmp-1726867225.776319-15065-23237075871224=/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224 <<< 12033 1726867225.81195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.81216: stderr chunk (state=3): >>><<< 12033 1726867225.81219: stdout chunk (state=3): >>><<< 12033 1726867225.81237: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867225.776319-15065-23237075871224=/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.81263: variable 'ansible_module_compression' from source: unknown 12033 1726867225.81302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-1203351msjvvn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12033 1726867225.81332: variable 'ansible_facts' from source: unknown 12033 1726867225.81391: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py 12033 1726867225.81484: Sending initial data 12033 1726867225.81487: Sent initial data (154 bytes) 12033 1726867225.82153: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.82156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.82204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.83738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12033 1726867225.83742: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12033 1726867225.83782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12033 1726867225.83822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-1203351msjvvn/tmpcwylp8wh /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py <<< 12033 1726867225.83830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py" <<< 12033 1726867225.83863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-1203351msjvvn/tmpcwylp8wh" to remote "/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py" <<< 12033 1726867225.84446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.84583: stderr chunk (state=3): >>><<< 12033 1726867225.84588: stdout chunk (state=3): >>><<< 12033 1726867225.84590: done transferring module to remote 12033 1726867225.84592: _low_level_execute_command(): starting 12033 1726867225.84594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/ /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py && sleep 0' 12033 1726867225.85070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12033 1726867225.85081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867225.85093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12033 1726867225.85108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867225.85147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.85260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867225.85263: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.85265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867225.85307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867225.87091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867225.87095: stdout chunk (state=3): >>><<< 12033 1726867225.87097: stderr chunk (state=3): >>><<< 12033 1726867225.87125: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867225.87128: _low_level_execute_command(): starting 12033 1726867225.87131: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/AnsiballZ_command.py && sleep 0' 12033 1726867225.87719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867225.87725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK <<< 12033 1726867225.87728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12033 
1726867225.87780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867226.29933: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13773 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1296 0 --:--:-- --:--:-- --:--:-- 1299", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:20:26.027678", "end": "2024-09-20 17:20:26.296739", "delta": "0:00:00.269061", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12033 1726867226.31531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 
<<< 12033 1726867226.31561: stderr chunk (state=3): >>><<< 12033 1726867226.31564: stdout chunk (state=3): >>><<< 12033 1726867226.31583: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13773 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1296 0 --:--:-- --:--:-- --:--:-- 1299", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:20:26.027678", "end": "2024-09-20 17:20:26.296739", "delta": "0:00:00.269061", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.68 closed. 12033 1726867226.31624: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12033 1726867226.31630: _low_level_execute_command(): starting 12033 1726867226.31635: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867225.776319-15065-23237075871224/ > /dev/null 2>&1 && sleep 0' 12033 1726867226.32066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867226.32102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12033 1726867226.32106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found <<< 12033 1726867226.32108: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867226.32110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12033 1726867226.32112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found <<< 12033 1726867226.32114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12033 1726867226.32170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' <<< 12033 1726867226.32173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12033 1726867226.32214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12033 1726867226.34034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12033 1726867226.34057: stderr chunk (state=3): >>><<< 12033 1726867226.34060: stdout chunk (state=3): >>><<< 12033 1726867226.34072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.68 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.68 originally 10.31.15.68 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/2615b8b480' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12033 1726867226.34083: handler run complete 12033 1726867226.34103: Evaluated conditional (False): False 12033 1726867226.34113: attempt loop complete, returning result 12033 1726867226.34116: _execute() done 12033 1726867226.34119: dumping result to json 12033 1726867226.34124: done dumping result, returning 12033 1726867226.34132: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affcac9-a3a5-74bb-502b-000000000e5d] 12033 1726867226.34137: sending task result for task 0affcac9-a3a5-74bb-502b-000000000e5d 12033 1726867226.34236: done sending task result for task 0affcac9-a3a5-74bb-502b-000000000e5d 12033 1726867226.34239: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.269061",
    "end": "2024-09-20 17:20:26.296739",
    "rc": 0,
    "start": "2024-09-20 17:20:26.027678"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 13773 0 --:--:-- --:--:-- --:--:-- 13863
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 1296 0 --:--:-- --:--:-- --:--:-- 1299

12033 1726867226.34304: no more pending results, returning what we have 12033 1726867226.34308: results queue empty 12033 1726867226.34309:
checking for any_errors_fatal 12033 1726867226.34318: done checking for any_errors_fatal 12033 1726867226.34319: checking for max_fail_percentage 12033 1726867226.34321: done checking for max_fail_percentage 12033 1726867226.34322: checking to see if all hosts have failed and the running result is not ok 12033 1726867226.34323: done checking to see if all hosts have failed 12033 1726867226.34323: getting the remaining hosts for this loop 12033 1726867226.34329: done getting the remaining hosts for this loop 12033 1726867226.34333: getting the next task for host managed_node3 12033 1726867226.34344: done getting next task for host managed_node3 12033 1726867226.34345: ^ task is: TASK: meta (flush_handlers) 12033 1726867226.34348: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867226.34352: getting variables 12033 1726867226.34354: in VariableManager get_vars() 12033 1726867226.34407: Calling all_inventory to load vars for managed_node3 12033 1726867226.34410: Calling groups_inventory to load vars for managed_node3 12033 1726867226.34413: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867226.34423: Calling all_plugins_play to load vars for managed_node3 12033 1726867226.34426: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867226.34428: Calling groups_plugins_play to load vars for managed_node3 12033 1726867226.39910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867226.42124: done with get_vars() 12033 1726867226.42148: done getting variables 12033 1726867226.42211: in VariableManager get_vars() 12033 1726867226.42228: Calling all_inventory to load vars for managed_node3 12033 1726867226.42231: Calling groups_inventory to load vars for managed_node3 12033 1726867226.42233: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867226.42238: Calling all_plugins_play to load vars for managed_node3 12033 1726867226.42240: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867226.42243: Calling groups_plugins_play to load vars for managed_node3 12033 1726867226.43535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867226.45936: done with get_vars() 12033 1726867226.45964: done queuing things up, now waiting for results queue to drain 12033 1726867226.45967: results queue empty 12033 1726867226.45967: checking for any_errors_fatal 12033 1726867226.45971: done checking for any_errors_fatal 12033 1726867226.45972: checking for max_fail_percentage 12033 1726867226.45973: done checking for max_fail_percentage 12033 1726867226.45974: checking to see if all hosts have failed and the running result is not 
ok 12033 1726867226.45974: done checking to see if all hosts have failed 12033 1726867226.45975: getting the remaining hosts for this loop 12033 1726867226.45976: done getting the remaining hosts for this loop 12033 1726867226.45981: getting the next task for host managed_node3 12033 1726867226.45984: done getting next task for host managed_node3 12033 1726867226.45986: ^ task is: TASK: meta (flush_handlers) 12033 1726867226.45987: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12033 1726867226.45989: getting variables 12033 1726867226.45990: in VariableManager get_vars() 12033 1726867226.46009: Calling all_inventory to load vars for managed_node3 12033 1726867226.46011: Calling groups_inventory to load vars for managed_node3 12033 1726867226.46014: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867226.46019: Calling all_plugins_play to load vars for managed_node3 12033 1726867226.46021: Calling groups_plugins_inventory to load vars for managed_node3 12033 1726867226.46024: Calling groups_plugins_play to load vars for managed_node3 12033 1726867226.48445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867226.51633: done with get_vars() 12033 1726867226.51654: done getting variables 12033 1726867226.51835: in VariableManager get_vars() 12033 1726867226.51853: Calling all_inventory to load vars for managed_node3 12033 1726867226.51856: Calling groups_inventory to load vars for managed_node3 12033 1726867226.51858: Calling all_plugins_inventory to load vars for managed_node3 12033 1726867226.51862: Calling all_plugins_play to load vars for managed_node3 12033 1726867226.51865: Calling groups_plugins_inventory to load vars for 
managed_node3 12033 1726867226.51867: Calling groups_plugins_play to load vars for managed_node3 12033 1726867226.53051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12033 1726867226.54778: done with get_vars() 12033 1726867226.54805: done queuing things up, now waiting for results queue to drain 12033 1726867226.54807: results queue empty 12033 1726867226.54808: checking for any_errors_fatal 12033 1726867226.54809: done checking for any_errors_fatal 12033 1726867226.54810: checking for max_fail_percentage 12033 1726867226.54811: done checking for max_fail_percentage 12033 1726867226.54812: checking to see if all hosts have failed and the running result is not ok 12033 1726867226.54812: done checking to see if all hosts have failed 12033 1726867226.54813: getting the remaining hosts for this loop 12033 1726867226.54814: done getting the remaining hosts for this loop 12033 1726867226.54817: getting the next task for host managed_node3 12033 1726867226.54820: done getting next task for host managed_node3 12033 1726867226.54821: ^ task is: None 12033 1726867226.54822: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12033 1726867226.54823: done queuing things up, now waiting for results queue to drain 12033 1726867226.54824: results queue empty 12033 1726867226.54825: checking for any_errors_fatal 12033 1726867226.54826: done checking for any_errors_fatal 12033 1726867226.54826: checking for max_fail_percentage 12033 1726867226.54827: done checking for max_fail_percentage 12033 1726867226.54828: checking to see if all hosts have failed and the running result is not ok 12033 1726867226.54829: done checking to see if all hosts have failed 12033 1726867226.54830: getting the next task for host managed_node3 12033 1726867226.54833: done getting next task for host managed_node3 12033 1726867226.54834: ^ task is: None 12033 1726867226.54835: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node3 : ok=148 changed=4 unreachable=0 failed=0 skipped=97 rescued=0 ignored=0

Friday 20 September 2024 17:20:26 -0400 (0:00:00.850) 0:01:05.666 ******
===============================================================================
** TEST check bond settings --------------------------------------------- 5.66s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
** TEST check IPv4 ------------------------------------------------------ 2.80s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.02s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.80s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
** TEST check bond settings --------------------------------------------- 1.74s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
Create test interfaces -------------------------------------------------- 1.70s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which packages are installed --- 1.66s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create test interfaces -------------------------------------------------- 1.58s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.24s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.20s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.17s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.94s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install dnsmasq --------------------------------------------------------- 0.87s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Verify DNS and network connectivity ------------------------------------- 0.85s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.84s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.82s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
12033 1726867226.55156: RUNNING CLEANUP
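The "Verify DNS and network connectivity" task in this log runs its check through `ansible.legacy.command` with `_uses_shell: true`, so the script only appears JSON-escaped inside `_raw_params`. As a readability aid, here is a sketch of the same logic unescaped; the `check_connectivity` function wrapper, the host arguments, and `return` in place of the original `exit` are adaptations for reuse, not part of the task itself:

```shell
#!/usr/bin/env bash
# Same shell options the task's script sets: abort on command failure,
# on unset variables, and on failures anywhere in a pipeline.
set -euo pipefail

# Sketch of the task's connectivity check. The getent/curl logic mirrors the
# _raw_params string in the log; the function wrapper and `return` (instead of
# the original `exit`) are adaptations so the check can be called repeatedly.
check_connectivity() {
    echo "CHECK DNS AND CONNECTIVITY"
    for host in "$@"; do
        # DNS first: getent consults the system resolver (NSS), so this
        # exercises the same lookup path the managed node's services use.
        if ! getent hosts "$host"; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # Then HTTPS reachability; the response body is discarded.
        if ! curl -o /dev/null "https://$host"; then
            echo "FAILED to contact host $host"
            return 1
        fi
    done
}
```

In the play the check runs against mirrors.fedoraproject.org and mirrors.centos.org, and the `rc=0` result above reflects both hosts resolving (the IPv6 records in STDOUT come from `getent`) and answering over HTTPS (the transfer tables in STDERR come from `curl`).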